7,110 Commits over 3,531 Days - 0.08cph!
Explain that OnTextureData will provide pixel data instead of rendering to texture
Create a 1x1 placeholder texture, swap out with first decoded frame so we always have a texture available. Invoke event when video has loaded. Add Width and Height of video.
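The placeholder-texture commit describes a common pattern: hand out a valid 1x1 texture immediately, then swap in the first decoded frame and fire a loaded event. A minimal sketch of that pattern in Python, with hypothetical names (`VideoPlayerSketch`, `on_loaded`, `on_first_frame` are illustrative, not the real API):

```python
class VideoPlayerSketch:
    def __init__(self):
        # 1x1 placeholder so callers always have a valid texture to bind,
        # even before the first frame has been decoded
        self.texture = [[(0, 0, 0, 255)]]  # one BGRA pixel
        self.width = 1
        self.height = 1
        self.loaded = False
        self.on_loaded = None  # event fired once the real size is known

    def on_first_frame(self, pixels, width, height):
        # Swap the placeholder for the first decoded frame; the video's
        # true Width and Height are only known at this point
        self.texture = pixels
        self.width, self.height = width, height
        if not self.loaded:
            self.loaded = True
            if self.on_loaded:
                self.on_loaded(width, height)
```

The key design point is that consumers never have to null-check the texture; they just see it change size once the stream is decoded.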
Add VideoPlayer.Play that takes a filesystem
New ffmpeg video player
Delete old libav binaries
Add VideoPlayer IsPaused and TogglePause
Add example videoplayers for native rendering and pixmap rendering
Use BGRA for video filter if the player wants pixel data
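Qt's common pixmap formats store pixels as BGRA on little-endian machines, which is presumably why the filter outputs BGRA when pixel data is requested. If the decoder ever hands back RGBA instead, the fix is a per-pixel channel swap; an illustrative sketch (not the engine's actual conversion code):

```python
def rgba_to_bgra(frame: bytes) -> bytes:
    # Swap the R and B channels of every 4-byte pixel (RGBA -> BGRA),
    # leaving G and A where they are
    out = bytearray(frame)
    for i in range(0, len(out), 4):
        out[i], out[i + 2] = out[i + 2], out[i]
    return bytes(out)
```

In practice this swizzle is done by the video filter itself (or SIMD code), not per-pixel in a scripting language; the loop just shows the byte layout.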
Pass video frame data to managed so pixmap can be created when texture isn't ideal to use in Qt
Adjust frame queue size, using way too much memory
Read packets, decode audio and video in separate threads, sync video to external clock
Allow videos to repeat or not
Break out of audio and video threads when there's no more packets to read
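The commits above (bounded frame queue, decoding in separate threads, breaking out when packets run out) follow a standard producer-consumer shape. A minimal Python sketch, assuming a blocking queue and a `None` sentinel; names are illustrative:

```python
import queue
import threading

MAX_FRAMES = 8  # keep the queue small; each decoded frame is large

frames = queue.Queue(maxsize=MAX_FRAMES)

def decode_thread(packets, out_queue=frames):
    # Producer: put() blocks while the queue is full, so decoding
    # can never run far ahead of playback and eat memory
    for pkt in packets:
        out_queue.put(pkt)
    out_queue.put(None)  # sentinel: no more packets to read

def playback_thread(out, in_queue=frames):
    # Consumer: drain frames until the sentinel arrives, then break
    # out of the thread
    while True:
        frame = in_queue.get()
        if frame is None:
            break
        out.append(frame)
```

The bounded `maxsize` is what "Adjust frame queue size, using way too much memory" amounts to: the cap, times frame size, is the worst-case buffer memory.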
Try to keep video in sync
Free audio output on video stop
Get rid of video64.dll usage, we're in control now
Attempt to decode audio but doesn't sync yet
Take off decoding time from sleep time
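"Keep video in sync" plus "take off decoding time from sleep time" reduces to a small piece of arithmetic: sleep until the frame's presentation timestamp on the external clock, minus the time already spent decoding, and never sleep a negative amount. A hedged sketch (function name and exact policy are assumptions):

```python
def frame_wait(pts, clock_now, decode_cost):
    """Seconds to sleep before presenting a frame.

    pts:         the frame's presentation time on the external clock
    clock_now:   current external clock reading
    decode_cost: time just spent decoding this frame
    """
    wait = pts - clock_now - decode_cost  # take decoding time off the sleep
    return max(0.0, wait)  # a late frame is shown immediately; never sleep negative
```

Real players (FFmpeg's ffplay among them) add further policy, such as dropping frames that are very late, but the subtraction above is the core of the sync loop.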
Test video decoding in a threaded job
Stop decoding and wait till job has finished when video stops
Initial support for playing video
Bind resolve shaders and textures directly instead of going through renderable pass, avoids it being used across frame boundaries
Initial support for audio playback
Get rid of event wait shit for audio, ExecuteOnMainThread handles this
Support seeking
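Seeking with FFmpeg means rescaling a target in seconds into the stream's `time_base` units before calling the seek function. A sketch of just the rescale step (the `1/90000` time base is illustrative, MPEG-TS style; real code reads it from the stream):

```python
from fractions import Fraction

def seconds_to_pts(seconds, time_base=Fraction(1, 90000)):
    # FFmpeg streams count time in units of the stream's time_base;
    # a seek target in seconds must be rescaled into those units first
    return round(Fraction(seconds) / time_base)
```

This is the same computation `av_rescale_q` does on the native side, kept exact here by using rationals.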
Fix streaming from url not working
Init, shutdown and run frame for video playback in a service so we don't have to do it manually
Give video player the managed object so we can call back to it
Tell managed when texture has been created, we don't know until first frame has been decoded
Bind all the functions of video player
Pass video events down to managed
Add Duration and CurrentPlaybackTime
Documentation
Call OnTextureCreated after new texture has been rendered to
Add clear to audio stream interface, clear the resample buffer instead of having to recreate the whole stream
Add use_doppler sound op to stack, disable it on attached streams (doppler is no good on streams because it pitches)
Start allowing video audio stream to attach to sound events to change how audio is played
Allow audio stream to be attached to multiple sound events
Add SoundStream.Play (returns a sound handle, can be called multiple times), obsolete Sound.CreateStream (this was all ass backwards)
Add VideoPlayer.PlayAudio, called when audio is ready, can be called multiple times for multiple sound sources
Add public constructor for SoundStream, this is allowed now that it doesn't depend on having a Sound
Add more properties to video player
Turns out sound stream can't have multiple sound sources, at least for now
Add qt video widget example https://files.facepunch.com/layla/1b2911b1/sbox_tueAgFGKr8.mp4
Remove ui test that doesn't exist
Remove libogg, libvorbis, libvpx, all unused
Add ffmpeg to thirdparty
video.vpc
Force push ffmpeg dlls
video app system bullshit
Force push ffmpeg libs
Test reading a single frame directly with ffmpeg
network init on video system init
Range check Model.GetMaterials, Add Model.Materials to iterate through all mesh materials
Add Model.GetMaterialGroupIndex and Model.GetMaterials (from group index or name)
ShaderGraph: Fix formatting on generated code
ShaderGraph: Move graph serialization to its own file
ShaderGraph: Refactor codegen to keep it self contained in the compiler
ShaderGraph: Give pastes an ident so people know what they are
ShaderGraph: Pastes are now gzip-compressed and base64-encoded to make them easier to share with people
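Combining the two commits above (an ident prefix, then gzip with base64), the paste format can be sketched as follows. The `shadergraph:` prefix is a hypothetical ident, not the real one:

```python
import base64
import gzip

IDENT = "shadergraph:"  # hypothetical ident prefix; the real one may differ

def encode_paste(graph_json: str) -> str:
    # gzip shrinks the JSON, base64 makes it clipboard/chat safe,
    # and the ident tells people (and the editor) what the blob is
    packed = gzip.compress(graph_json.encode("utf-8"))
    return IDENT + base64.b64encode(packed).decode("ascii")

def decode_paste(paste: str) -> str:
    if not paste.startswith(IDENT):
        raise ValueError("not a shader graph paste")
    packed = base64.b64decode(paste[len(IDENT):])
    return gzip.decompress(packed).decode("utf-8")
```

Base64 costs about 33% size overhead, but gzip on JSON usually wins that back several times over, so shared pastes end up much shorter than the raw serialization.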
ShaderGraph: Refactor NodeResult
Allow underscores in parameter names
Refactor writing parameter options
Add bool IsAttribute to Parameter and Texture nodes
If true, `Attribute( "name" )` option will be added
Don't include UI options in generated code when using attributes (attributes don't show in the UI anyway)
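The attribute behavior described in the last three commits can be sketched as a small branch in the option writer. This is an illustrative reconstruction, not the generator's actual code; `parameter_options` is a hypothetical helper:

```python
def parameter_options(name, is_attribute, ui_options):
    # When a parameter is attribute-bound, emit only the Attribute() option;
    # UI options are skipped because attributes never show in the UI anyway
    if is_attribute:
        return [f'Attribute( "{name}" )']
    return list(ui_options)
```

So a texture node with `IsAttribute` set would generate `Attribute( "Name" )` and nothing else, while a normal parameter keeps its UI metadata.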