Initial support for playing video
Bind resolve shaders and textures directly instead of going through the renderable pass; this avoids it being used across frame boundaries
Initial support for audio playback
Get rid of the event wait handling for audio; ExecuteOnMainThread handles this
Support seeking
Fix streaming from URL not working
Init, shutdown and run frame for video playback in a service so we don't have to do it manually
Give video player the managed object so we can call back to it
Tell managed when the texture has been created; we don't know until the first frame has been decoded
Bind all the functions of video player
Pass video events down to managed
Add Duration and CurrentPlaybackTime
Documentation
Call OnTextureCreated after new texture has been rendered to
Add clear to audio stream interface, clear the resample buffer instead of having to recreate the whole stream
Add use_doppler sound op to the stack, disable it on attached streams (doppler is no good on streams because it shifts their pitch)
Start allowing video audio stream to attach to sound events to change how audio is played
Allow audio stream to be attached to multiple sound events
Add SoundStream.Play (returns a sound handle, can be called multiple times); obsolete Sound.CreateStream (the old API was backwards)
Add VideoPlayer.PlayAudio, called when audio is ready, can be called multiple times for multiple sound sources
Add a public constructor for SoundStream; this is allowed now that it doesn't depend on having a Sound
Add more properties to video player
Turns out a sound stream can't have multiple sound sources, at least for now
Add qt video widget example https://files.facepunch.com/layla/1b2911b1/sbox_tueAgFGKr8.mp4
Remove UI test that doesn't exist
Remove libogg, libvorbis, libvpx, all unused
Add ffmpeg to thirdparty
Add video.vpc project file
Force push ffmpeg DLLs
Video app system boilerplate
Force push ffmpeg libs
Test reading a single frame directly with ffmpeg
Network init on video system init
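A minimal sketch of what the two entries above amount to with ffmpeg's API (illustrative only, not the engine's code, with error handling trimmed): initialise the network layer once at video system init so URLs work, then open the input and decode a single video frame.

```cpp
extern "C" {
#include <libavcodec/avcodec.h>
#include <libavformat/avformat.h>
}

bool DecodeFirstFrame( const char *path )
{
    avformat_network_init(); // once at video system init, needed for URL inputs

    AVFormatContext *fmt = nullptr;
    if ( avformat_open_input( &fmt, path, nullptr, nullptr ) < 0 ) return false;
    if ( avformat_find_stream_info( fmt, nullptr ) < 0 ) return false;

    // Find the first video stream and open a decoder for it
    int stream = -1;
    for ( unsigned i = 0; i < fmt->nb_streams; i++ )
        if ( fmt->streams[i]->codecpar->codec_type == AVMEDIA_TYPE_VIDEO ) { stream = (int)i; break; }
    if ( stream < 0 ) return false;

    const AVCodec *codec = avcodec_find_decoder( fmt->streams[stream]->codecpar->codec_id );
    if ( !codec ) return false;
    AVCodecContext *ctx = avcodec_alloc_context3( codec );
    avcodec_parameters_to_context( ctx, fmt->streams[stream]->codecpar );
    if ( avcodec_open2( ctx, codec, nullptr ) < 0 ) return false;

    AVPacket *pkt = av_packet_alloc();
    AVFrame *frame = av_frame_alloc();
    bool gotFrame = false;

    // Feed packets until the decoder hands back one picture
    while ( !gotFrame && av_read_frame( fmt, pkt ) >= 0 )
    {
        if ( pkt->stream_index == stream )
        {
            avcodec_send_packet( ctx, pkt );
            if ( avcodec_receive_frame( ctx, frame ) == 0 )
                gotFrame = true; // frame->data, width, height and format are valid here
        }
        av_packet_unref( pkt );
    }

    av_frame_free( &frame );
    av_packet_free( &pkt );
    avcodec_free_context( &ctx );
    avformat_close_input( &fmt );
    return gotFrame;
}
```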
Test video decoding in a threaded job
Stop decoding and wait until the job has finished when the video stops
Subtract decoding time from sleep time
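Roughly, that pacing change amounts to the loop below (a hypothetical sketch; the function and parameter names are illustrative): measure how long decoding/presenting a frame took and only sleep for the remainder of the frame interval, so decode cost doesn't slow playback down.

```cpp
#include <chrono>
#include <functional>
#include <thread>

void PresentLoop( double framerate, const bool &running, const std::function<void()> &decodeAndPresentFrame )
{
    using clock = std::chrono::steady_clock;
    const auto frameInterval = std::chrono::duration<double>( 1.0 / framerate );

    while ( running )
    {
        const auto start = clock::now();

        decodeAndPresentFrame(); // stand-in for the real decode + texture upload

        // Sleep only for whatever is left of the frame interval
        const auto decodeTime = clock::now() - start;
        const auto sleepTime = frameInterval - decodeTime;
        if ( sleepTime.count() > 0.0 )
            std::this_thread::sleep_for( sleepTime );
    }
}
```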
Attempt to decode audio, but it doesn't sync yet
Get rid of video64.dll usage, we're in control now
video -> videosystem
Free audio output on video stop
Try to keep video in sync
Break out of the audio and video threads when there are no more packets to read
Read packets, decode audio and video in separate threads, sync video to external clock
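The three entries above describe the demuxer/decoder split and the clock model; the sketch below shows the shape of it (an assumed structure, not the shipped code): a reader thread pushes packets into per-stream queues, the video thread sleeps until each frame's timestamp lines up with a wall clock started at playback time, and the decode threads exit once the reader signals EOF and their queues drain. The audio thread follows the same pattern and PresentFrame is a hypothetical display step.

```cpp
extern "C" {
#include <libavcodec/avcodec.h>
#include <libavformat/avformat.h>
}
#include <chrono>
#include <condition_variable>
#include <mutex>
#include <queue>
#include <thread>

void PresentFrame( const AVFrame *frame ); // hypothetical upload/display step

// Packet queue shared between the reader and one decoder thread. Pop() returns
// nullptr once the reader has signalled EOF and the queue has drained, which is
// how the decode threads break out.
struct PacketQueue
{
    std::queue<AVPacket *> packets;
    std::mutex mutex;
    std::condition_variable cond;
    bool finished = false;

    void Push( AVPacket *pkt )
    {
        std::lock_guard<std::mutex> lock( mutex );
        packets.push( pkt );
        cond.notify_one();
    }
    AVPacket *Pop()
    {
        std::unique_lock<std::mutex> lock( mutex );
        cond.wait( lock, [this] { return !packets.empty() || finished; } );
        if ( packets.empty() )
            return nullptr;
        AVPacket *pkt = packets.front();
        packets.pop();
        return pkt;
    }
    void Finish()
    {
        std::lock_guard<std::mutex> lock( mutex );
        finished = true;
        cond.notify_all();
    }
};

// Reader thread: demux packets and hand them to the right queue.
void ReadThread( AVFormatContext *fmt, int videoStream, int audioStream,
                 PacketQueue &videoQueue, PacketQueue &audioQueue )
{
    AVPacket *pkt = av_packet_alloc();
    while ( av_read_frame( fmt, pkt ) >= 0 )
    {
        if ( pkt->stream_index == videoStream )
            videoQueue.Push( av_packet_clone( pkt ) );
        else if ( pkt->stream_index == audioStream )
            audioQueue.Push( av_packet_clone( pkt ) );
        av_packet_unref( pkt );
    }
    av_packet_free( &pkt );
    videoQueue.Finish(); // no more packets: let both decode threads drain and exit
    audioQueue.Finish();
}

// Video thread: decode frames and present each one when the external clock
// (wall time since playback started) reaches its timestamp.
void VideoThread( AVCodecContext *ctx, AVStream *stream, PacketQueue &queue )
{
    const auto playbackStart = std::chrono::steady_clock::now();
    AVFrame *frame = av_frame_alloc();
    AVPacket *pkt = nullptr;

    while ( ( pkt = queue.Pop() ) != nullptr )
    {
        avcodec_send_packet( ctx, pkt );
        av_packet_free( &pkt );

        while ( avcodec_receive_frame( ctx, frame ) == 0 )
        {
            double pts = frame->best_effort_timestamp * av_q2d( stream->time_base );
            if ( pts < 0.0 )
                pts = 0.0; // simplified handling of AV_NOPTS_VALUE

            std::this_thread::sleep_until( playbackStart + std::chrono::duration<double>( pts ) );
            PresentFrame( frame );
        }
    }
    av_frame_free( &frame );
}
```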
Allow videos to repeat or not
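The repeat option typically comes down to what the reader does at end-of-file; a hedged sketch (HandleEndOfFile is a hypothetical helper, not an engine function): seek back to the start and flush the decoders when repeating, otherwise finish playback.

```cpp
extern "C" {
#include <libavcodec/avcodec.h>
#include <libavformat/avformat.h>
}

// Returns true if playback should continue (i.e. the video repeats).
bool HandleEndOfFile( AVFormatContext *fmt, AVCodecContext *video, AVCodecContext *audio, bool repeat )
{
    if ( !repeat )
        return false; // finish playback normally

    // Seek the demuxer back to the start and drop any buffered decoder state
    av_seek_frame( fmt, -1, 0, AVSEEK_FLAG_BACKWARD );
    avcodec_flush_buffers( video );
    if ( audio )
        avcodec_flush_buffers( audio );
    return true; // keep reading packets from the beginning
}
```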
Adjust frame queue size; it was using far too much memory
Pass video frame data to managed so a pixmap can be created when a texture isn't ideal to use in Qt
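On the Qt side, pixel data handed up this way can be turned into a pixmap roughly like this (an illustrative helper, not the actual widget code, assuming 32-bit BGRA frames as used a few entries below):

```cpp
#include <QImage>
#include <QPixmap>

// Wraps raw 32-bit BGRA pixel data in a QImage and converts it to a QPixmap.
// QImage::Format_ARGB32 matches BGRA byte order on little-endian platforms, and
// copy() detaches the image from the frame buffer before it gets reused.
QPixmap PixmapFromFrameData( const unsigned char *pixels, int width, int height, int pitch )
{
    QImage image( pixels, width, height, pitch, QImage::Format_ARGB32 );
    return QPixmap::fromImage( image.copy() );
}
```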
New ffmpeg video player
Delete old libav binaries
Add VideoPlayer IsPaused and TogglePause
Add example videoplayers for native rendering and pixmap rendering
Use BGRA for video filter if the player wants pixel data
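The entry above routes the conversion through an ffmpeg video filter; the same BGRA output can be sketched more simply with libswscale (illustrative only, ConvertToBGRA is a hypothetical helper):

```cpp
extern "C" {
#include <libavutil/frame.h>
#include <libswscale/swscale.h>
}

// Convert a decoded frame into BGRA; caller frees the result with av_frame_free.
AVFrame *ConvertToBGRA( const AVFrame *src )
{
    AVFrame *dst = av_frame_alloc();
    dst->format = AV_PIX_FMT_BGRA;
    dst->width = src->width;
    dst->height = src->height;
    av_frame_get_buffer( dst, 0 );

    SwsContext *sws = sws_getContext( src->width, src->height, (AVPixelFormat)src->format,
                                      dst->width, dst->height, AV_PIX_FMT_BGRA,
                                      SWS_BILINEAR, nullptr, nullptr, nullptr );
    sws_scale( sws, src->data, src->linesize, 0, src->height, dst->data, dst->linesize );
    sws_freeContext( sws );
    return dst;
}
```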
Add VideoPlayer.Play that takes a filesystem
Create a 1x1 placeholder texture and swap it out with the first decoded frame so we always have a texture available. Invoke an event when the video has loaded. Add Width and Height of the video.
Explain that OnTextureData will provide pixel data instead of rendering to texture