- GLMediaPlayer
- Remove State.Stopped and method stop() - redundant, use pause() / destroy()
- Add notion of stream IDs
- Add API doc: State / Stream-ID incl. html-anchor
- Expose video/audio PTS, ..
- Expose optional AudioSink
- Min multithreaded textureCount is 4 (EGL* and FFMPEG*)
- GLMediaPlayerImpl
- Move AudioSink related impl. to this class,
allowing a tight video implementation to reuse the logic.
- Remove 'synchronized' methods; synchronize on State
where applicable.
- Implement new methods (see above).
- playSpeed is handled partially in AudioSink.
If it exceeds AudioSink's capabilities, drop audio and rely solely on video sync.
- Video sync (WIP)
- Video PTS delay based on a geometric weight (see sketch below)
- Reset video SCR if 'out of range', resync w/ PTS
- FramePusher
- allow interruption when pausing/stopping,
while waiting for the next available free frame to decode.
- FFMPEGMediaPlayer
- Add proper AudioDataFormat negotiation AudioSink <-> libav
- Parse libav's SampleFormat
- Remove AudioSink interaction (moved to GLMediaPlayerImpl)
- Tests (MovieSimple, MovieCube):
- Add aid/vid (audio/video stream id) selection
- Add KeyListener for actions: seek(..), play()/pause(), setPlaySpeed(..)
- Dump perf-string every 2s
- TODO:
- Add audio sync in AudioSink, similar to GLMediaPlayer's weighted video delay,
here: drop audio frames.
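Since the 'geometric weight' video-PTS delay and the SCR reset are only named above, here is a minimal sketch of the idea, assuming an exponentially decaying (geometric) weighting and an illustrative drift threshold; class, field and constant names are hypothetical, not JOGL's actual code.

/*
 * Minimal sketch of the two video-sync ideas above: smoothing the measured
 * video-PTS delay with a geometric (exponentially decaying) weight, and
 * resetting the SCR (system clock reference) once the drift is 'out of range'.
 * All names and constants are illustrative assumptions, not JOGL code.
 */
public final class VideoSyncSketch {
    private static final int MAX_DRIFT_MS = 1000;  // assumed 'out of range' threshold
    private static final float WEIGHT = 0.1f;      // assumed geometric smoothing factor

    private long scrStartMs;        // wall-clock time corresponding to PTS 0
    private float smoothedDelayMs;  // geometrically weighted video-PTS delay

    public void start(final long nowMs, final int firstVideoPtsMs) {
        scrStartMs = nowMs - firstVideoPtsMs;
        smoothedDelayMs = 0f;
    }

    /** Returns how many ms the caller may wait before presenting the frame. */
    public int onNewFrame(final long nowMs, final int videoPtsMs) {
        final int scrPtsMs = (int) (nowMs - scrStartMs);  // where the clock says we are
        final int delayMs = videoPtsMs - scrPtsMs;        // >0: frame early, <0: frame late

        if (Math.abs(delayMs) > MAX_DRIFT_MS) {
            // 'out of range': reset the SCR, i.e. resync it w/ the video PTS
            scrStartMs = nowMs - videoPtsMs;
            smoothedDelayMs = 0f;
            return 0;
        }
        // geometric weighting: the new sample contributes WEIGHT, the history (1 - WEIGHT)
        smoothedDelayMs = WEIGHT * delayMs + (1f - WEIGHT) * smoothedDelayMs;
        return Math.max(0, Math.round(smoothedDelayMs));
    }
}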

available EGL/FFMPeg. WIP!
Off-thread decoding:
If the validated (impl.) textureCount is > 2, decoding happens on an extra thread.
If decoding requires a GL context, a shared context is created for the decoding thread (see sketch below).
API Changes:
- initGLStream(..): Adds 'textureCount' as argument.
- TextureSequence.TexSeqEventListener.newFrameAvailable(..) exposes the newly available frame
- TextureSequence.TextureFrame exposes the PTS (video)
Implementation:
- 'int validateTextureCount(int)': the implementation decides whether textureCount can be > 2, i.e. whether off-thread decoding is allowed;
the default is NO, w/ textureCount==2!
- 'boolean requiresOffthreadGLCtx()': implementation decides whether shared context is required for off-thread decoding
- 'syncFrame2Audio(TextureFrame frame)': implementation shall handle a/v sync, due to audio stream details (pts, buffered frames)
- FFMPEGMediaPlayer extends GLMediaPlayerImpl, no more EGLMediaPlayerImpl (redundant)
+++
- SyncedRingbuffer: Expose T[] array
+++
TODO:
- syncAV!
- test Android
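A rough sketch of the off-thread decoding path described above: an extra decoder thread that creates a GL context shared with the rendering context only when the implementation requires one. DecoderBackend and TextureFrameRing are invented placeholders; only GLDrawable.createContext(..) and GLContext.makeCurrent()/isCurrent()/release()/destroy() are actual JOGL calls (shown with the current com.jogamp.opengl package names).

import com.jogamp.opengl.GLContext;
import com.jogamp.opengl.GLDrawable;

final class OffThreadDecoderSketch implements Runnable {

    interface DecoderBackend {        // placeholder for the platform decoder (e.g. FFMPEG*)
        boolean decodeNextFrameInto(Object freeTextureFrame);
    }
    interface TextureFrameRing {      // placeholder for a synced ring of TextureFrames
        Object getFreeFrame() throws InterruptedException;
        void putDecodedFrame(Object frame);
    }

    private final GLDrawable drawable;
    private final GLContext renderCtx;   // the rendering context to share textures with
    private final boolean needsGLCtx;    // cf. 'requiresOffthreadGLCtx()'
    private final DecoderBackend decoder;
    private final TextureFrameRing ring;
    private volatile boolean running = true;

    OffThreadDecoderSketch(final GLDrawable drawable, final GLContext renderCtx,
                           final boolean needsGLCtx, final DecoderBackend decoder,
                           final TextureFrameRing ring) {
        this.drawable = drawable;
        this.renderCtx = renderCtx;
        this.needsGLCtx = needsGLCtx;
        this.decoder = decoder;
        this.ring = ring;
    }

    @Override
    public void run() {
        // Create the shared context on this thread, only if decoding needs GL.
        final GLContext sharedCtx = needsGLCtx ? drawable.createContext(renderCtx) : null;
        try {
            if (null != sharedCtx && GLContext.CONTEXT_NOT_CURRENT == sharedCtx.makeCurrent()) {
                throw new RuntimeException("Could not make shared decoder context current");
            }
            while (running) {
                final Object frame = ring.getFreeFrame();  // blocks until a frame is free
                if (decoder.decodeNextFrameInto(frame)) {
                    ring.putDecodedFrame(frame);           // consumer side fires newFrameAvailable(..)
                }
            }
        } catch (final InterruptedException e) {
            // interrupted while waiting for a free frame, e.g. on pause/stop
        } finally {
            if (null != sharedCtx) {
                if (sharedCtx.isCurrent()) {
                    sharedCtx.release();
                }
                sharedCtx.destroy();
            }
        }
    }

    void stopDecoding(final Thread decoderThread) {
        running = false;
        decoderThread.interrupt();
    }
}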

- AudioSink w/ AudioFrame and formats public
- ALAudioSink uses a circular buffer now, hence relaxing the single-threaded player mode
- FFMPEGMediaPlayer uses multiple audio frames (equal to the ALAudioSink number)
and wraps the data in NIO buffers w/o copying.
- FFMPEGMediaPlayer audio threading is currently disabled: distorted sound.
It seems that the ALAudioSink's circular buffer usage is good enough for now.
- Verbosity only w/ DEBUG flag
- New SyncedRingbuffer for efficient synced buffering (see sketch below)
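The SyncedRingbuffer itself is not shown in this log; the following is a minimal, generic sketch of the same idea, assuming a fixed-size circular array with blocking, synchronized put/get so that the decoder (producer) and the ALAudioSink / rendering thread (consumer) can run on different threads.

// Minimal sketch of a synced ring buffer (illustrative only, not the JOGL class).
public final class SyncedRingbufferSketch<T> {
    private final T[] array;
    private int readPos = 0, size = 0;

    @SuppressWarnings("unchecked")
    public SyncedRingbufferSketch(final int capacity) {
        array = (T[]) new Object[capacity];
    }

    public synchronized void put(final T element) throws InterruptedException {
        while (size == array.length) {
            wait();                                   // buffer full: block the producer
        }
        array[(readPos + size) % array.length] = element;
        size++;
        notifyAll();                                  // wake a waiting consumer
    }

    public synchronized T get() throws InterruptedException {
        while (size == 0) {
            wait();                                   // buffer empty: block the consumer
        }
        final T element = array[readPos];
        array[readPos] = null;
        readPos = (readPos + 1) % array.length;
        size--;
        notifyAll();                                  // wake a waiting producer
        return element;
    }

    public synchronized int size() { return size; }
}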

implementations.

General enhancements.
For details about the TextureSequence/GLMediaPlayer shader collaboration w/ your own shader source,
see TextureSequence and the TexCubeES2 / MovieSimple demos.
TextureSequence allows implementations to provide their own texture lookup function,
which may provide color space conversion (YUV) or other runtime hw-accelerated features (see the GLSL sketch below).
Have a look at the next commit, which provides a Libav/FFMpeg implementation w/ YUV/RGB shader conversion.
MovieCube adds keyboard control (Android: firm touch on the display to launch the keyboard; don't break it though :)
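To illustrate the custom texture lookup hook mentioned above, here is a sketch of a GLSL lookup function performing YUV-to-RGB conversion (approximate BT.601 coefficients), embedded as a Java string; the function and sampler names are assumptions, not the names any particular implementation uses.

public final class YuvLookupShaderSketch {
    /** Name a demo shader would call instead of texture2D(..). */
    public static final String TEX_LOOKUP_FUNC_NAME = "myTexture2D";

    /** GLSL lookup function: samples three assumed Y/U/V planes and converts to RGB. */
    public static final String TEX_LOOKUP_FUNC =
        "uniform sampler2D texY;\n" +
        "uniform sampler2D texU;\n" +
        "uniform sampler2D texV;\n" +
        "\n" +
        "vec4 " + TEX_LOOKUP_FUNC_NAME + "(in vec2 texCoord) {\n" +
        "  float y = texture2D(texY, texCoord).r;\n" +
        "  float u = texture2D(texU, texCoord).r - 0.5;\n" +
        "  float v = texture2D(texV, texCoord).r - 0.5;\n" +
        "  float r = y + 1.402 * v;\n" +
        "  float g = y - 0.344 * u - 0.714 * v;\n" +
        "  float b = y + 1.772 * u;\n" +
        "  return vec4(r, g, b, 1.0);\n" +
        "}\n";

    private YuvLookupShaderSketch() { }
}

A demo fragment shader would then call myTexture2D(texCoord) wherever it would otherwise call texture2D(..), which is roughly the shader collaboration the TexCubeES2 / MovieSimple demos show.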

Texture
- Add TextureSequence, the base interface of GLMediaPlayer, to generalize texture streams
- TextureSequence / GLMediaPlayer: Use inner classes for event and texture data
- getLastTexture() shall never return 'null'; initialization of TextureSequence (initGLStream(..), etc.)
shall provide a TextureFrame w/ the stream's dimensions (see sketch below).
- GLMediaPlayerImpl.createTexImageImpl() y-flip defaults to 'false';
the impl. shall define y-flip, if required.
- Added MovieCube demo
- Fix Texture: initialize aspectRatio for the 'wrapping' ctor
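A sketch of the consumer-side contract noted above: getLastTexture() never returns 'null' once the stream is initialized, so a render loop can always bind a valid frame. TexSeq and TextureFrame below are local stand-ins mirroring the methods named in this log; only GL and Texture (current com.jogamp.opengl packages) are real JOGL types.

import com.jogamp.opengl.GL;
import com.jogamp.opengl.util.texture.Texture;

final class TextureSequenceRenderSketch {

    /** Minimal stand-in mirroring the TextureSequence methods named in this log. */
    interface TexSeq {
        TextureFrame getNextTexture(GL gl, boolean blocking);
        TextureFrame getLastTexture();
    }
    /** Stand-in for TextureSequence.TextureFrame. */
    interface TextureFrame {
        Texture getTexture();
    }

    private final TexSeq texSeq;

    TextureSequenceRenderSketch(final TexSeq texSeq) {
        this.texSeq = texSeq;
    }

    void drawFrame(final GL gl) {
        // Prefer a newly decoded frame, else fall back to the last one,
        // which per the contract above is never null after initialization.
        TextureFrame frame = texSeq.getNextTexture(gl, false /* non-blocking */);
        if (null == frame) {
            frame = texSeq.getLastTexture();
        }
        final Texture tex = frame.getTexture();
        tex.enable(gl);   // enable the texture target (e.g. GL_TEXTURE_2D)
        tex.bind(gl);     // bind the texture name
        // ... issue the demo's draw call with its own shader here ...
    }
}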

GLMediaPlayer:
Merging 'initStream()' and 'initGL()' into 'initGLStream()' due to incompatible/buggy implementations (Android/Tegra)
requiring the GL texture to be set up before preparing the stream.
This also implies that w/o a GL context we cannot fetch the stream information (size, ..);
hence we need to evaluate this detail (FIXME); see the sketch below.
'getNextTexture(GL gl, boolean blocking)' can request the impl. to block
GLMediaEventListener:
The TextureFrame is not yet available; adding 'when'.
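The consequence described above, that stream information is only available once a GL context is current, can be sketched with a GLEventListener; PlayerHandle is a stand-in so no exact initGLStream(..) signature is claimed, since that signature changes across the commits in this log.

import com.jogamp.opengl.GL2ES2;
import com.jogamp.opengl.GLAutoDrawable;
import com.jogamp.opengl.GLEventListener;

final class InitGLStreamOrderSketch implements GLEventListener {

    /** Stand-in for GLMediaPlayer; only the ordering matters here. */
    interface PlayerHandle {
        void initGLStream(GL2ES2 gl);   // placeholder for initGLStream(..)
        int getWidth();
        int getHeight();
    }

    private final PlayerHandle player;

    InitGLStreamOrderSketch(final PlayerHandle player) {
        this.player = player;
    }

    @Override
    public void init(final GLAutoDrawable drawable) {
        final GL2ES2 gl = drawable.getGL().getGL2ES2();
        player.initGLStream(gl);              // GL texture set up while preparing the stream
        final int w = player.getWidth();      // stream information only known from here on
        final int h = player.getHeight();
        System.err.println("stream size: " + w + "x" + h);
    }

    @Override
    public void display(final GLAutoDrawable drawable) {
        // getNextTexture(gl, blocking) / rendering would go here
    }

    @Override
    public void reshape(final GLAutoDrawable d, final int x, final int y, final int w, final int h) { }

    @Override
    public void dispose(final GLAutoDrawable drawable) { }
}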

- Factory falls back to NullGLMediaPlayer, allowing testing on platforms where no player is available (see sketch below).
- MovieSimple copyright (c) to JogAmp, since it is no longer derived from the old project.
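A minimal sketch of the fallback idea above: the factory tries platform candidates and returns a do-nothing player if none is available, so demos and tests still run. The types below are illustrative; only the NullGLMediaPlayer role is taken from the commit.

final class GLMediaPlayerFactorySketch {
    interface Player { }                                   // stand-in for GLMediaPlayer
    static final class NullPlayer implements Player { }    // decodes nothing, never fails

    /** Each supplier may throw if its platform backend is unavailable. */
    interface PlayerSupplier {
        Player create() throws Exception;
    }

    static Player createDefault(final PlayerSupplier... platformCandidates) {
        for (final PlayerSupplier candidate : platformCandidates) {
            try {
                return candidate.create();                 // e.g. an FFMPEG-, OMX- or Android-based player
            } catch (final Exception e) {
                // backend missing on this platform; try the next candidate
            }
        }
        return new NullPlayer();                           // guaranteed fallback, cf. NullGLMediaPlayer
    }
}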

connected URL. API doc: Using an HTML table for the state chart.

initStream(URL) + initGL(GL)) .. IllegalStateException if the order is wrong. Using internet streams of BigBuckBunny, if available.
- Splitting the initialization into stream and GL parts allows using the stream information (e.g. size, ..)
for setting the GLDrawable properties (see sketch below).
- Make the impl. more bulletproof ..
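A sketch of the two-phase initialization and the IllegalStateException guard mentioned above: initStream(..) runs w/o GL and yields the stream information, initGL(..) must follow in order. The enum, fields and parameters are illustrative, not the exact GLMediaPlayer signatures.

import java.net.URL;

final class TwoPhaseInitSketch {
    enum State { UNINITIALIZED, STREAM_INITIALIZED, GL_INITIALIZED }

    private State state = State.UNINITIALIZED;
    private int width, height;

    /** Phase 1: open the stream, no GL required; stream information becomes available. */
    synchronized void initStream(final URL url) {
        if (state != State.UNINITIALIZED) {
            throw new IllegalStateException("initStream(..) called in state " + state);
        }
        // ... open the demuxer for 'url' and read width/height, fps, bps here ...
        state = State.STREAM_INITIALIZED;
    }

    /** Phase 2: create GL resources; the stream size can already size the GLDrawable. */
    synchronized void initGL(/* GL gl */) {
        if (state != State.STREAM_INITIALIZED) {
            throw new IllegalStateException("initGL(..) called in state " + state);
        }
        // ... create textures sized width x height here ...
        state = State.GL_INITIALIZED;
    }

    synchronized int getWidth()  { return width; }
    synchronized int getHeight() { return height; }
}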

Android API 14
- Introduce states (see sketch below)
- Customize / access texture target, count and features.
- Expose TextureFrame.
- Use 'long' for all time values in msec.
- Mark information as optional in the API doc (fps, bps, ..)
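Since 'Introduce states' is only named above, here is a small sketch of an explicit state enum with a transition check; the state names echo those mentioned elsewhere in this log (e.g. State.Stopped, later removed again), while the transition table itself is an assumption.

import java.util.EnumMap;
import java.util.EnumSet;
import java.util.Map;

final class PlayerStateSketch {
    enum State { Uninitialized, Stopped, Playing, Paused }

    private static final Map<State, EnumSet<State>> ALLOWED = new EnumMap<>(State.class);
    static {
        ALLOWED.put(State.Uninitialized, EnumSet.of(State.Stopped));
        ALLOWED.put(State.Stopped,       EnumSet.of(State.Playing));
        ALLOWED.put(State.Playing,       EnumSet.of(State.Paused, State.Stopped));
        ALLOWED.put(State.Paused,        EnumSet.of(State.Playing, State.Stopped));
    }

    private State state = State.Uninitialized;

    /** Validates calls like play()/pause() against the current state. */
    synchronized void transition(final State next) {
        if (!ALLOWED.get(state).contains(next)) {
            throw new IllegalStateException(state + " -> " + next + " not allowed");
        }
        state = next;
    }

    synchronized State getState() { return state; }
}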

Android API 14 MediaPlayer impl of GLMediaPlayer.
Android API 14 MediaPlayer allows usage of OMX AL direct decode-to-texture via libstagefright (OMX AL usage included); see the API sketch below.
Status: untested, not working. Need to fix the native OMX IL (stream detection and splitting) and/or the GStreamer implementation.
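For context, the standard Android API 14 decode-to-texture path this implementation builds on looks roughly as follows; this is plain Android API usage (MediaPlayer, SurfaceTexture, Surface), not the JOGL wrapper code itself, and the class and field names are illustrative.

import android.graphics.SurfaceTexture;
import android.media.MediaPlayer;
import android.view.Surface;

final class AndroidDecodeToTextureSketch implements SurfaceTexture.OnFrameAvailableListener {
    private final MediaPlayer mediaPlayer = new MediaPlayer();
    private final SurfaceTexture surfaceTexture;
    private volatile boolean frameAvailable;

    /** texName must be a GL texture created with target GL_TEXTURE_EXTERNAL_OES. */
    AndroidDecodeToTextureSketch(final int texName, final String uri) throws Exception {
        surfaceTexture = new SurfaceTexture(texName);
        surfaceTexture.setOnFrameAvailableListener(this);
        final Surface surface = new Surface(surfaceTexture);   // API 14
        mediaPlayer.setDataSource(uri);
        mediaPlayer.setSurface(surface);                       // decoded frames go to the texture
        surface.release();                                     // the player keeps its own reference
        mediaPlayer.prepare();
        mediaPlayer.start();
    }

    @Override
    public void onFrameAvailable(final SurfaceTexture st) {
        frameAvailable = true;                                 // signal the GL thread
    }

    /** Call on the GL thread (context current) before sampling the external texture. */
    void updateTexImageIfNew() {
        if (frameAvailable) {
            frameAvailable = false;
            surfaceTexture.updateTexImage();                   // latches the newest decoded frame
        }
    }
}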