4,339 Commits over 1,552 Days - 0.12cph!
Remove texture animation from list when texture isn't being used anymore
Add super simple ui test that shows an animated image
Make Texture.Animation a class not a struct
VideoPlayer: Refactor to remove audio filtering bullshit, resample at 44100 Hz, 2 channels
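The engine resamples through ffmpeg; purely as an illustration of what a fixed 44100 Hz stereo target means, a nearest-sample sketch (hypothetical helper, not engine code) could look like:

```csharp
// Sketch only: resample arbitrary 16-bit PCM to 44100 Hz stereo by nearest
// sample, duplicating mono to both channels. The real player uses ffmpeg's
// resampler; this just shows the fixed output format.
static short[] ResampleToStereo44100( short[] input, int inRate, int inChannels )
{
    const int outRate = 44100;
    int inFrames = input.Length / inChannels;
    int outFrames = (int)((long)inFrames * outRate / inRate);
    var output = new short[outFrames * 2];

    for ( int i = 0; i < outFrames; i++ )
    {
        int src = (int)((long)i * inRate / outRate) * inChannels;
        output[i * 2] = input[src];                                   // left
        output[i * 2 + 1] = inChannels > 1 ? input[src + 1] : input[src]; // right
    }
    return output;
}
```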
Pass in video ext from managed so we can check that for input format
Image loader creates textures for multiple frames if they exist. Whitelist gif and webp. Swap out texture handles for the current frame. https://files.facepunch.com/layla/1b0711b1/sbox-dev_rCZwXFIzD1.mp4
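A minimal sketch of the frame-swap idea, assuming hypothetical fields for what the engine stores per animated texture (the actual internals aren't shown in these commits):

```csharp
using System;

// Illustrative only: one native texture handle per decoded gif/webp frame,
// with the "live" handle swapped as time advances.
public class TextureAnimation
{
    public IntPtr[] FrameHandles = Array.Empty<IntPtr>(); // one handle per frame
    public float[] FrameDelays = Array.Empty<float>();    // seconds per frame

    int current;
    float elapsed;

    // Called once per tick: advance past any due frames, then swap the handle.
    public void Update( float deltaTime, Action<IntPtr> setHandle )
    {
        if ( FrameHandles.Length == 0 )
            return;

        elapsed += deltaTime;
        while ( FrameDelays[current] > 0 && elapsed >= FrameDelays[current] )
        {
            elapsed -= FrameDelays[current];
            current = (current + 1) % FrameHandles.Length;
            setHandle( FrameHandles[current] ); // this frame is now the visible texture
        }
    }
}
```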
Avoid threading assert in audio thread
VideoPlayer: Fix repeat not working when user doesn't output sound on a video with sound
Fix video player not playing from filesystem path
Whitelist mp4 and webm to avoid going any further than we have to
Limit input formats to mp4 and webm
VideoPlayer.Play that takes a URL now checks that it's actually a URL
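Both checks above are simple to sketch (illustrative helpers, not the engine's actual code):

```csharp
// Container whitelist by extension, plus a sanity check that a "url" is one.
static readonly string[] AllowedExtensions = { ".mp4", ".webm" };

static bool IsAllowedVideo( string path )
{
    var ext = System.IO.Path.GetExtension( path ).ToLowerInvariant();
    return System.Array.IndexOf( AllowedExtensions, ext ) >= 0;
}

static bool IsActuallyUrl( string url )
{
    return System.Uri.TryCreate( url, System.UriKind.Absolute, out var uri )
        && (uri.Scheme == System.Uri.UriSchemeHttp || uri.Scheme == System.Uri.UriSchemeHttps);
}
```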
Modeldoc: Compile skeleton with ALL bones, whether or not they're used in skinning
Fix func_voxelsurface to use the new way of getting the surface face
ShaderGraph: Fix formatting on generated code
Add blendmodes.hlsl until shadergraph just generates these functions
ShaderGraph: Fix NRE when indenting generated code
procedural.hlsl isn't in common anymore
ShaderGraph: Only find nodes derived from ShaderNode
ShaderGraph: common/proceedural -> common/procedural
Video player (#1050)
VideoPlayer class that decodes mp4 and webm from paths or urls and provides the texture and audio.
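A rough usage sketch assembled from the members named in this PR's commits (Play, Duration, CurrentPlaybackTime, IsPaused, TogglePause, Width, Height); exact signatures in the engine may differ:

```csharp
// Hedged usage sketch, s&box context assumed.
var player = new VideoPlayer();
player.Play( "https://example.com/clip.mp4" );   // also accepts filesystem paths

// Dimensions and duration are only known once the first frame has decoded.
Log.Info( $"{player.Width}x{player.Height}, {player.Duration}s" );

if ( !player.IsPaused )
    player.TogglePause();                        // pause playback

float progress = player.CurrentPlaybackTime / player.Duration;
```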
Initial support for playing video
Bind resolve shaders and textures directly instead of going through renderable pass, avoids it being used across frame boundaries
Initial support for audio playback
Get rid of event wait shit for audio, ExecuteOnMainThread handles this
Support seeking
Fix streaming from url not working
Init, shutdown and run frame for video playback in a service so we don't have to do it manually
Give video player the managed object so we can call back to it
Tell managed when texture has been created, we don't know until first frame has been decoded
Bind all the functions of video player
Pass video events down to managed
Add Duration and CurrentPlaybackTime
Documentation
Call OnTextureCreated after new texture has been rendered to
Add clear to audio stream interface, clear the resample buffer instead of having to recreate the whole stream
Add use_doppler sound op to stack, disable it on attached streams (doppler is no good on streams because it pitches)
Start allowing video audio stream to attach to sound events to change how audio is played
Allow audio stream to be attached to multiple sound events
Add SoundStream.Play (returns a sound handle, can be called multiple times), obsolete Sound.CreateStream (this was all ass backwards)
Add VideoPlayer.PlayAudio, called when audio is ready, can be called multiple times for multiple sound sources
Add public constructor for SoundStream, this is allowed now that it no longer depends on having a Sound
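A sketch of the reworked flow from the three commits above; how VideoPlayer.PlayAudio hands the stream over is an assumption from the commit text:

```csharp
// SoundStream now has a public constructor and no longer needs a Sound.
var stream = new SoundStream();

// Play returns the handle to the playing sound and can be called again
// for another source. VideoPlayer.PlayAudio wires a stream up like this
// once audio is ready (exact shape assumed).
var sound = stream.Play();

// Old, now-obsolete flow:
// var sound = Sound.CreateStream( ... );
```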
Add more properties to video player
Turns out sound stream can't have multiple sound sources, at least for now
Add qt video widget example https://files.facepunch.com/layla/1b2911b1/sbox_tueAgFGKr8.mp4
Remove ui test that doesn't exist
Remove libogg, libvorbis, libvpx, all unused
Add ffmpeg to thirdparty
video.vpc
Force push ffmpeg dlls
video app system bullshit
Force push ffmpeg libs
Test reading a single frame directly with ffmpeg
network init on video system init
Test video decoding in a threaded job
Stop decoding and wait till job has finished when video stops
Subtract decoding time from sleep time
Attempt to decode audio, but it doesn't sync yet
Get rid of video64.dll usage, we're in control now
video -> videosystem
Free audio output on video stop
Try to keep video in sync
Break out of audio and video threads when there's no more packets to read
Read packets, decode audio and video in separate threads, sync video to external clock
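A minimal sketch of that video-thread structure, with VideoFrame and the delegates as illustrative stand-ins: frames are paced against a shared external clock (decode time is already on the clock, which is what "subtract decoding time from sleep time" amounts to), and the loop exits when packets run out.

```csharp
using System;
using System.Diagnostics;
using System.Threading;

static class VideoPacing
{
    public record VideoFrame( double PresentationTime );

    public static void VideoThread( Func<VideoFrame?> decodeNextFrame,
                                    Action<VideoFrame> present )
    {
        var clock = Stopwatch.StartNew();   // stands in for the external clock

        while ( true )
        {
            var frame = decodeNextFrame();
            if ( frame == null )
                break;   // no more packets to read: leave the thread

            // Sleep until this frame is due; time spent decoding has already
            // elapsed on the clock, so it comes off the sleep automatically.
            double wait = frame.PresentationTime - clock.Elapsed.TotalSeconds;
            if ( wait > 0 )
                Thread.Sleep( TimeSpan.FromSeconds( wait ) );

            present( frame );
        }
    }
}
```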
Allow videos to repeat or not
Adjust frame queue size, using way too much memory
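A bounded queue is the usual fix for that memory blow-up: the decoder blocks once the buffer is full instead of racing ahead of playback. A sketch using .NET's BlockingCollection (the capacity is an illustrative guess, not the engine's value):

```csharp
using System.Collections.Concurrent;

// Decoded frames as raw buffers, at most 4 in flight.
var frameQueue = new BlockingCollection<byte[]>( boundedCapacity: 4 );

// Decoder thread: blocks while the queue is full.
// frameQueue.Add( decodedFrame );

// Video thread: blocks until a frame is available.
// var frame = frameQueue.Take();
```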
Pass video frame data to managed so pixmap can be created when texture isn't ideal to use in Qt
New ffmpeg video player
Delete old libav binaries
Add VideoPlayer IsPaused and TogglePause
Add example videoplayers for native rendering and pixmap rendering
Use BGRA for video filter if the player wants pixel data
Add VideoPlayer.Play that takes a filesystem
Create a 1x1 placeholder texture, swap out with first decoded frame so we always have a texture available. Invoke event when video has loaded. Add Width and Height of video.
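A sketch of that placeholder pattern: publish a 1x1 texture immediately so consumers always have something valid to bind, then swap in a correctly sized texture once the first frame decodes. Texture.Create(...).Finish() reflects the engine's builder-style API; the other member names are illustrative.

```csharp
using System;

class VideoTextureHolder
{
    public Texture Texture { get; private set; } = Texture.Create( 1, 1 ).Finish();
    public int Width { get; private set; }
    public int Height { get; private set; }
    public event Action OnLoaded;

    // Called from the decoder when the first frame (and so the size) is known.
    public void OnFirstFrameDecoded( int width, int height )
    {
        (Width, Height) = (width, height);
        Texture = Texture.Create( width, height ).Finish();
        OnLoaded?.Invoke();   // video has loaded; Width/Height now valid
    }
}
```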
Explain that OnTextureData will provide pixel data instead of rendering to texture
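A sketch of that pixel-data path: when the consumer wants raw pixels (e.g. to build a Qt pixmap), the player invokes OnTextureData with BGRA bytes rather than rendering into a texture. The callback signature here is an assumption from the commit text.

```csharp
// Hypothetical handler; the real delegate shape may differ.
player.OnTextureData = ( byte[] bgra, int width, int height ) =>
{
    // 4 bytes per pixel, tightly packed BGRA: hand off to the widget layer.
    CreatePixmap( bgra, width, height );
};
```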