reconhell wrote:
flowreen91 wrote:

Now that Nvidia has launched the Nvidia App, whose latest update can turn SDR games into HDR:
https://www.nvidia.com/en-us/software/nvidia-app/
have people started playing around with RTX HDR & RTX Dynamic Vibrance to watch their non-HDR videos?
You can easily enable it for your video player by dragging and dropping it into the NvTrueHDR app:
https://www.nexusmods.com/site/mods/781
https://i.ibb.co/8BZJvMT/Desktop-Screenshot.png

How did you get this working? I dragged and dropped my MPC-HC exe onto nvtruehdr.exe and it said the HDR-on setting was saved, but when I start the app and go to the game filter, nothing is shown for the game profile like in your screenshot...

To get that menu, you have to use the new Nvidia app that's in beta. It doesn't work with multiple monitors, by the way. https://www.nvidia.com/en-us/software/nvidia-app/

2

(3 replies, posted in Using SVP)

arizer wrote:
DragonicPrime wrote:
arizer wrote:

I have exclusive fullscreen enabled in MPC Video Renderer and have tried switching all the available settings, but my LG CX's FPS meter stays at 118 fps when playing a video at 48 fps or whatever FPS it is.
Edit: I actually got it working in MPV now, but would prefer MPC-HC.

How did you get it working in MPV? It used to work for me, then it just stopped at one point, and I haven't been able to enable it again.

I have these in my mpv.conf, which is located at C:\Users\username\AppData\Roaming\mpv. Create the file if it's not there.


# start in fullscreen and use D3D11 exclusive fullscreen
fullscreen=yes
d3d11-exclusive-fs=yes
# named pipe that SVP uses to control mpv
input-ipc-server=mpvpipe

volume=100
volume-max=100
# sync video to the audio clock and don't drop frames on precise seeks
video-sync=audio
hr-seek-framedrop=no

# 16-bit half-float framebuffers
fbo-format=rgba16hf

# libplacebo renderer on Direct3D 11, hardware decoding with copy-back
vo=gpu-next
gpu-api=d3d11
hwdec=auto-copy
hwdec-codecs=all
gpu-context=d3d11
drm-vrr-enabled=no
d3d11-sync-interval=1

This is my full mpv.conf, which also enables Dolby Vision decoding.
"target-peak=800" is the screen's maximum peak brightness in nits; in my case (LG CX) it's 800 nits.


# start in fullscreen and use D3D11 exclusive fullscreen
fullscreen=yes
d3d11-exclusive-fs=yes
# named pipe that SVP uses to control mpv
input-ipc-server=mpvpipe

volume=100
volume-max=100
# sync video to the audio clock and don't drop frames on precise seeks
video-sync=audio
hr-seek-framedrop=no

# 16-bit half-float framebuffers
fbo-format=rgba16hf

# libplacebo renderer on Direct3D 11, hardware decoding with copy-back
vo=gpu-next
gpu-api=d3d11
hwdec=auto-copy
hwdec-codecs=all
gpu-context=d3d11
drm-vrr-enabled=no
d3d11-sync-interval=1

# output HDR10 (PQ / BT.2020) and target the display's 800-nit peak
target-peak=800
target-trc=pq
target-prim=bt.2020

# tone and gamut mapping
tone-mapping=spline
tone-mapping-mode=luma
gamut-mapping-mode=perceptual
tone-mapping-param=bt.2390
tone-mapping-max-boost=2.0

# dithering for 10-bit output
dither=error-diffusion
error-diffusion=burkes
dither-depth=10
spirv-compiler=auto

# OLED contrast, let the display switch to HDR mode, measure per-scene peak
target-contrast=inf
target-colorspace-hint=yes
hdr-compute-peak=yes

# OSD appearance
osd-level=1
osd-bar-w=25
osd-color=0.5
osd-bar-align-x=0
osd-bar-align-y=-1

# scalers
scale=ewa_lanczos
cscale=ewa_lanczos
dscale=ewa_lanczos
tscale=ewa_lanczos

That seems to have done it, on the LG C1. I already had a lot of those settings in my config file, but I guess there must've been a typo somewhere that messed up G-Sync. Thanks!

3

(3 replies, posted in Using SVP)

arizer wrote:

I have exclusive fullscreen enabled in MPC Video Renderer and have tried switching all the available settings, but my LG CX's FPS meter stays at 118 fps when playing a video at 48 fps or whatever FPS it is.
Edit: I actually got it working in MPV now, but would prefer MPC-HC.

How did you get it working in MPV? It used to work for me, then it just stopped at one point, and I haven't been able to enable it again.

ategetemen wrote:

Hello guys. ♥
I have downloaded and activated the RIFE AI engine, but every movie I run is still in automatic mode!
I restarted the PC, but it was not fixed.
Please help me disable automatic mode.
I have Windows 11 & an RTX 4090.


https://imgtr.ee/image/I7C5vD
https://imgtr.ee/image/I7C69s

Either use the "Apply if..." button at the bottom and make the profile apply to any video over 1 fps or so, or click the SVP logo, then click "Profile for active video" and choose RIFE.

Xenocyde wrote:
aloola wrote:
Blackfyre wrote:

New nVidia driver, 545.84 WHQL



I assume this doesn't impact us, right? It would be nice if we could get a 2x performance lift.

I've just tested with the new driver; no performance boost.

Maybe the devs need to update the RIFE model first? Not sure if it makes any difference, as the performance boost specifically targets Stable Diffusion image generation, so it might not carry over to frame interpolation. Maybe the improved frame generation included with DLSS 3.5 could help RIFE?

As far as I know, DLSS 3.5 only improved the denoising used in ray-tracing workloads like path tracing in Cyberpunk. Frame Generation doesn't look or run any better.

Is there a tutorial somewhere on how to enable RIFE in MPC-HC? I've done it before, but it doesn't even seem to turn on now. I remember having to do something, but I can't seem to find that guide on here anymore. I might just be blind.
Edit: okay, nvm, I'm not blind, just dumb lol. Figured it out.

dawkinscm wrote:
DragonicPrime wrote:
dawkinscm wrote:

I run SVP with 1920x2160 files. With 4.7, GPU utilisation averages about 75%; with 4.6 the average was around 53%. That's a pretty big jump, which, as I said, I'm fine with. But I do wonder: is there any optimisation still to be done, or is the new model simply denser with information, so it is what it is?

I just tried the latest update (about 20 hours before writing this), and it seems version 4.7 is slightly easier to run now. It's still harder than 4.6, but from the small bit of testing I was able to do so far, it seems slightly better. Hopefully it gets some more optimizations to bring it about level with 4.6.

It might be to do with the upgrade of mpv to the latest version, but I've been using the latest version for a few months now, so it made zero difference for me.

Oh, maybe. I just updated when I got the option through SVP.

dawkinscm wrote:

I run SVP with 1920x2160 files. With 4.7, GPU utilisation averages about 75%; with 4.6 the average was around 53%. That's a pretty big jump, which, as I said, I'm fine with. But I do wonder: is there any optimisation still to be done, or is the new model simply denser with information, so it is what it is?

I just tried the latest update (about 20 hours before writing this), and it seems version 4.7 is slightly easier to run now. It's still harder than 4.6, but from the small bit of testing I was able to do so far, it seems slightly better. Hopefully it gets some more optimizations to bring it about level with 4.6.

This version seems to run worse than 4.6 for me; GPU usage is higher. It doesn't seem to be much higher, but it's noticeable.

10

(0 replies, posted in Using SVP)

I know you can use mpv upscalers along with SVP, but I recently found a new upscaler for mpv that also uses the Tensor Cores in Nvidia GPUs. RIFE uses the same thing, so I'm curious if there's any way of using them together; when I tried, only the first one enabled would work.
This is the GitHub page for the upscaler: https://github.com/the-database/mpv-ups … animejanai
From what I tried, it looks really good, so it would be nice to be able to use both together.
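
For context, an ordinary shader-based upscaler does stack with SVP, since mpv applies glsl-shaders in the renderer after the video-filter chain where SVP's interpolation runs. A minimal sketch of that setup in mpv.conf, with "SomeUpscaler.glsl" as a placeholder for whatever shader file you actually use:

# SVP attaches its interpolation filter to mpv through this pipe at runtime
input-ipc-server=mpvpipe

# GLSL shaders run in the renderer, after SVP's filter, so the two can stack;
# "SomeUpscaler.glsl" is a placeholder name, not a real shader
vo=gpu-next
glsl-shaders-append=~~/shaders/SomeUpscaler.glsl

If the new upscaler hooks into mpv some other way (e.g. as a VapourSynth filter rather than a GLSL shader), it would be competing with SVP for the same slot, which might be why only the first one enabled works.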

Drakko01 wrote:

Can someone remind me how to use the ensemble model? In the model drop-down I only see 4.4 and 4.6. I use MPC-HC as a player.

Copy the folder into the SVP folder > rife > models. After that it should use it automatically; at least it does for me. If it worked correctly, it should pop up that command prompt window while it sets everything up for each resolution, assuming you have performance mode enabled. A little warning: 4K content doesn't seem to work very well with this version. The quality is noticeably worse for me, and for others in this thread if I remember correctly. A rough sketch of the layout is below.
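
Just to illustrate where it ends up; the install path below is only an example, and the ensemble folder keeps whatever name the download had:

C:\Program Files (x86)\SVP 4\        <- example install location; yours may differ
  rife\
    models\
      ...existing model folders...
      <downloaded ensemble folder>   <- copy it here as-is, keeping its own name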

UHD wrote:
DragonicPrime wrote:

Thanks for the post. This sounds interesting. I'm not a programmer or anything, so I don't know how much I can help, but I'd love to test it and share my results when possible. The thing I'm most curious about is probably the performance. A big benefit of RIFE is being able to run it in real time now, so I would love to see how this new model performs and what the quality is like in different videos. I'm looking forward to seeing more on this.

Unfortunately, this method is relatively slow, at least compared to RIFE; see Table 1: https://arxiv.org/pdf/2211.11423.pdf

Tensor Core optimisation will probably speed up inference significantly. For now, however, we need to find the model that is most useful to us; then we will think about what to do next.

I don't think anything is going to replace RIFE in terms of speed in the near future, but for lower-resolution video, or for video encoding where the highest possible quality is needed, we may be looking for something better, albeit slower.

If graphics cards double in power every two years, who knows whether BiT won't replace RIFE for real-time interpolation in a while, even though it is slower. After all, for 4K the bottleneck for RIFE is probably no longer the graphics card but RAM.

Then there is the issue of steering researchers towards joint video deblurring and frame interpolation, which I want to do through my project, which still needs a lot of work: https://github.com/AIVFI/Video-Frame-In … g-Rankings

Yeah, I figured it would be slow; I just didn't look through all the info since there was a lot to go through lol. Like I said, though, I'm willing to help and share results when the time comes. Looking forward to it.

Thanks for the post. This sounds interesting. I'm not a programmer or anything, so I don't know how much I can help, but I'd love to test it and share my results when possible. The thing I'm most curious about is probably the performance. A big benefit of RIFE is being able to run it in real time now, so I would love to see how this new model performs and what the quality is like in different videos. I'm looking forward to seeing more on this.

14

(32 replies, posted in Using SVP)

cemaydnlar wrote:
DragonicPrime wrote:
cemaydnlar wrote:

Is there a way to get better quality without upscaling? I am using a 1080p monitor. I watch 1080p anime, but can I make it look better with some sort of GLSL shader? Does Anime4K work like this, or do I need a 4K monitor?

I used to use Anime4K on my 1080p monitor, and it still worked even with 1080p videos.

Can you give me a quick guide for Anime4K and how to use it?

Like Fortune said, just check the GitHub page; step-by-step instructions are all there. It's pretty easy. There's a rough example of the input.conf bindings below.
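
For reference, the setup is basically: copy the .glsl files into a "shaders" folder next to your mpv.conf, then add keybindings to input.conf that load them. A minimal sketch of what those bindings look like; the exact shader list per mode changes between Anime4K releases, so copy the real lines from the GitHub README rather than these:

# load an Anime4K shader chain on Ctrl+1 (shader list here is illustrative only)
CTRL+1 no-osd change-list glsl-shaders set "~~/shaders/Anime4K_Clamp_Highlights.glsl:~~/shaders/Anime4K_Upscale_CNN_x2_M.glsl"; show-text "Anime4K: example mode"
# clear all shaders on Ctrl+0
CTRL+0 no-osd change-list glsl-shaders clr ""; show-text "GLSL shaders cleared"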

15

(32 replies, posted in Using SVP)

cemaydnlar wrote:

Is there a way to get better quality without upscaling? I am using a 1080p monitor. I watch 1080p anime, but can I make it look better with some sort of GLSL shader? Does Anime4K work like this, or do I need a 4K monitor?

I used to use Anime4K on my 1080p monitor, and it still worked even with 1080p videos.

16

(32 replies, posted in Using SVP)

dlr5668 wrote:
DragonicPrime wrote:

I just use Anime4K. It might not be as good as full upscalers, but it gets the job done for me and makes anime look significantly better imo. Here's the link in case anyone wants it: https://github.com/bloc97/Anime4K

It's pretty good. Just press Ctrl+1 for 1080p anime and Ctrl+2 or 3 for low-quality old ones.

I actually like the look of Ctrl+4 for 1080p anime. It's much harder to run in real time, but imo it looks better.

17

(32 replies, posted in Using SVP)

AutoClickers wrote:
DragonicPrime wrote:

I just use Anime4K. It might not be as good as full upscalers, but it gets the job done for me and makes anime look significantly better imo. Here's the link in case anyone wants it: https://github.com/bloc97/Anime4K

It essentially doesn't work at all for old anime-style content that is 480p or 360p (the content that needs upscaling the most).

Yeah, it doesn't work nearly as well for old anime at low resolutions, but I figured I'd share it in case someone wanted to try it out and see how they like it, at least with newer anime.

18

(32 replies, posted in Using SVP)

I just use Anime4K. It might not be as good as full upscalers, but it gets the job done for me and makes anime look significantly better imo. Here's the link in case anyone wants it: https://github.com/bloc97/Anime4K

Grinchy wrote:

Anyone else getting short micro-lags?

It's perfectly smooth, but sometimes there are some short lags. Using a 5900X with an RTX 4080 and TensorRT.

I was getting something similar when testing Dolby Vision videos. With regular HDR or regular SDR videos I have no issues. Same CPU, but a 4090 instead.

aloola wrote:
DragonicPrime wrote:

Tried to do both. Same result. Not sure what's wrong

Try using AIDA64 to benchmark your DDR RAM first.
https://cdn.discordapp.com/attachments/290709370600423424/1081482725715873792/image.png

Almost a 1:1 copy of yours, it seems.
https://i.imgur.com/gOVpK8D.png

UHD wrote:

It looks like there is a problem, and a serious one at that. I guess I can't help you any more than to advise you to cleanly reinstall the GPU drivers and to reinstall SVP from scratch as well.

3070 Ti:
https://www.svp-team.com/forum/viewtopi … 819#p81819

Tried to do both. Same result. Not sure what's wrong

UHD wrote:
DragonicPrime wrote:

I just tried it myself, and I can't do x3 either with a 4090. I only have DDR4 RAM at 3800 MHz, but it doesn't seem to be the problem. On that test video, at 3x interpolation, my 4090 is maxed out at 100% usage; at 2x with the recommended settings, it's at around 70% usage.

This is puzzling: 3.8x more power and the GPU load drops to only 70%, compared to 90% for the 3070 Ti. If you've applied the tips above and still nothing, then I guess you'll have to compare step by step with those who have managed to interpolate x3 in real time.

Same video file (the LG demo I suggested), and encoding to start with. Write how many FPS you get for RIFE interpolation; we will compare the encoding FPS first. In the next post I will give further ideas of mine, and we will see how they affect the FPS.

Well, I don't think the FPS I get when encoding will be of much help, since it doesn't make any sense. I'm only getting 18 fps for some reason, even though I'm easily able to watch that same video in real time at 2x, which means 50 fps.

UHD wrote:
Mardon85 wrote:

I have 6400 MHz dual-rank 64 GB sticks netting just under 100 GB/s with a latency of 54 ns. Using the mpv player I can't interpolate a 21:9 4K HDR film at 3x (13700K @ 5.6 GHz all-core, 4.5 GHz E-cores).

You've been away from us for a month and a half. Nice to see you again, Mardon85. :)

An NVIDIA GeForce RTX 4090 graphics card and DDR5-6400 memory is an excellent setup to test the capabilities of RIFE. It's a bit puzzling that you can't achieve x3 interpolation, especially since, if I understood one of your earlier posts correctly, an NVIDIA GeForce RTX 3090 was completely sufficient for x2 interpolation.

Mardon85 wrote:

BTW, I was returning my watercooled 3090 back to stock today before selling it on. I thought I'd give this latest version of RIFE a go, and it does run 4K HDR at 48 FPS with no issues.

So this isn't limited to 40-series cards. Power draw is low too, around 150 W.

https://www.svp-team.com/forum/viewtopi … 686#p81686

Write more about your test parameters:

1. What is the maximum refresh rate of your monitor or TV? 100 Hz? 120 Hz?
2. Are you using G-Sync or FreeSync variable refresh rate?
3. Do you use the same settings that earl088 used successfully for x3 interpolation?
https://www.svp-team.com/forum/viewtopi … 799#p81799
4. Do you use the "Hardware-accelerated GPU scheduling OFF" setting that aloola suggests for improved performance?
https://www.svp-team.com/forum/viewtopi … 819#p81819
5. Write something more about the 21:9 4K HDR movie you are testing. Is it a 3840x2160 23.976 FPS file with black bars at the top and bottom, or does it have a different resolution and FPS?
6. Using the file below, can you not interpolate x3 in real time?
LG 4K HDR Demo - New York.ts
File size: 448 MiB
Duration: 1 min 13 sec
Overall bit rate: 51.4 Mbps
HDR format: SMPTE ST 2086, HDR10 compatible
Width: 3 840 pixels
Height: 2 160 pixels
Frame rate: 25.000 FPS
Color space: YUV
Chroma subsampling: 4:2:0
Bit depth: 10 bits

Direct link: https://drive.google.com/file/d/1dfR5TT … _bGfEXUvJ/
Source: http://hdr4k.blogspot.com/

At the moment we need to find a way for you to interpolate 4K HDR x3 in real time like the others with NVIDIA GeForce RTX 4090 graphics cards.

If that works, then we'll try x4. I've got some new ideas for performance enhancements that haven't come up here yet. ;)

I just tried it myself, and I can't do x3 either with a 4090. I only have DDR4 RAM at 3600 MHz, but it doesn't seem to be the problem. On that test video, at 3x interpolation, my 4090 is maxed out at 100% usage; at 2x with the recommended settings, it's at around 70% usage.

dawkinscm wrote:
earl088 wrote:

Trying this out on a 2160p Doctor Strange at x4 FPS, and the 4090 struggles even though it only shows 70-80% GPU usage.

x3 works perfectly though.

How do I get the Tensor Engine option? I've installed the latest update and it's not there.

Edit: Ignore that, I found it.

Open SVP, and at the top left open the drop-down menu: Utilities > Additional programs and features > Add or remove components, then scroll all the way down. The second one from the bottom is RIFE / TensorRT (beta). Select it and install it.

25

(1 replies, posted in Using SVP)

This is pretty much exactly what happens to me when playing a video with Dolby Vision. I haven't had any issues with any other video, though.