and subtitle language
(PGS ..) via FFMPEGMediaPlayer/FFmpeg
FFMPEGMediaPlayer related changes:
- Add libswscale (6th FFmpeg lib used) for sws_getCachedContext(), sws_scale() and sws_freeContext(),
  used natively to convert the palette'ed bitmap into the RGBA colorspace -> GL texture
  (see the conceptual sketch after this list)
- Handling AVSubtitleRect.type SUBTITLE_BITMAP
-- only handled if libswscale is available
-- configure/adjust the texture object
-- sws_scale the palette'ed bitmap into the texture
-- intermediate memory is cached, may be resized and is freed at destroy
-- texture objects are managed by and passed from GLMediaPlayerImpl,
   as they are also forwarded to the player client via SubBitmapEvent
- Passing the AVCodecID to GLMediaPlayerImpl, converted to our CodecID enum.
- Unifying creation and opening of AVCodecContext with 'createOpenedAVCodecContext(..)'
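As a rough illustration of what the native SUBTITLE_BITMAP path does, here is a conceptual Java sketch of expanding a palette'ed 8-bit subtitle bitmap into RGBA and uploading it into a texture object. It is not the actual implementation: the real conversion runs natively via sws_scale(), and the helper name, the assumed component order of the palette entries and the direct glTexImage2D upload (ignoring line stride and the cached intermediate memory) are illustrative assumptions only.

```java
import java.nio.ByteBuffer;

import com.jogamp.opengl.GL;

public final class PalettedSubtitleUploadSketch {
    /**
     * Expands an 8-bit palette'ed bitmap (one palette index per pixel, stride ignored)
     * into an RGBA buffer and uploads it into the given texture object.
     */
    public static void upload(final GL gl, final int texTarget, final int texID,
                              final int width, final int height,
                              final byte[] indices, final int[] paletteRGBA) {
        final ByteBuffer rgba = ByteBuffer.allocateDirect(width * height * 4);
        for(int i = 0; i < width * height; ++i) {
            final int c = paletteRGBA[ indices[i] & 0xff ];   // palette lookup
            rgba.put((byte) ( c          & 0xff));            // R (component order assumed)
            rgba.put((byte) ((c >>>  8)  & 0xff));            // G
            rgba.put((byte) ((c >>> 16)  & 0xff));            // B
            rgba.put((byte) ((c >>> 24)  & 0xff));            // A
        }
        rgba.rewind();
        gl.glBindTexture(texTarget, texID);
        // (Re)allocate the texture storage to the subtitle rect size and fill it.
        gl.glTexImage2D(texTarget, 0, GL.GL_RGBA, width, height, 0,
                        GL.GL_RGBA, GL.GL_UNSIGNED_BYTE, rgba);
    }
}
```

In the actual code, as the list above notes, the intermediate memory is cached and the texture object is configured/adjusted rather than recreated per packet.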
+++
SubtitleEvent*
- SubTextEvent now also handles ASS.Dialogue (FFmpeg 4)
  in addition to ASS.Event (FFmpeg 5, 6, ..).
+++
GLMediaPlayerImpl
- Added ringbuffer subTexFree, managing the Texture objects for bitmap'ed subtitles
-- Uses one bitmap-subtitle Texture per used textureCount in the cache,
   as one bitmap-subtitle can be displayed per frame.
   This could potentially be reduced to just two, but the resources used here are relatively low.
- Validating subTexFree + videoFramesFree usage;
  blocking get/put ringbuffer operations are used, since the buffers are utilized from different threads.
- Receives subtitle content from the native getNextPacket0() via callback,
  creates a SubtitleEvent instance and passes it to a SubtitleEventListener - if one exists.
  (See the MediaButton example and the sketch after this list.)
-- SubBitmapEvent also gets its special SubBitmapEvent.TextureOwner to handle the client releasing
   the event, allowing us to put the Texture resource back into 'subTexFree'.
   Passing the Texture object through like this is probably a weakness of the lifecycle,
   as it requires the client to ensure SubtitleEvent.release() gets called.
   See the MediaButton example!
- Exposing CodecID, allowing clients like MediaButton to handle SubtitleEvent content according to its codec
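The release contract above can be illustrated with a minimal client-side sketch, modeled on the MediaButton usage referenced in this log. Only SubtitleEvent.release(), SubTextEvent, SubBitmapEvent and the Texture hand-back via SubBitmapEvent.TextureOwner are taken from the log; the package locations, the registration method and the event field names used below are assumptions.

```java
import com.jogamp.opengl.util.av.GLMediaPlayer;
import com.jogamp.opengl.util.av.SubBitmapEvent;      // package locations assumed
import com.jogamp.opengl.util.av.SubTextEvent;
import com.jogamp.opengl.util.av.SubtitleEvent;
import com.jogamp.opengl.util.av.SubtitleEventListener;

public final class SubtitleClientSketch {
    /** Attaches a subtitle listener; everything marked 'assumed' is hypothetical. */
    public static void attach(final GLMediaPlayer player) {
        player.setSubtitleEventListener(new SubtitleEventListener() {  // registration method assumed
            @Override
            public void run(final SubtitleEvent e) {                   // callback name assumed
                if( e instanceof SubTextEvent ) {
                    final SubTextEvent t = (SubTextEvent) e;
                    System.err.println("Text subtitle: " + t.text);    // field name assumed
                    e.release();                                       // no GPU resource held, release right away
                } else if( e instanceof SubBitmapEvent ) {
                    final SubBitmapEvent b = (SubBitmapEvent) e;
                    // b.texture (field name assumed) is owned by the player's subTexFree ringbuffer.
                    // A real player keeps the event until its display end time; it is released
                    // immediately here for brevity, handing the Texture back via TextureOwner.
                    drawOverVideo(b);
                    b.release();
                } else {
                    e.release();
                }
            }
        });
    }
    private static void drawOverVideo(final SubBitmapEvent b) {
        // ... render b.texture (assumed) on top of the current video frame ...
    }
}
```

Failing to call release() on bitmap events would eventually starve the subTexFree ringbuffer, since each pending bitmap subtitle holds one of its Texture objects.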
func-ptr in native; readNextPacket0() passes the video and subtitle texTarget and texID.
For bitmap subtitles we need to push the bitmap into its own texture.
Hence readNextPacket0() must switch to the texture in use, using glEnable() on non-core profiles plus glBindTexture().
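A minimal Java-side sketch of that texture switch as it would look through JOGL's GL interface; the helper and the explicit coreProfile flag are assumptions for illustration, while the glEnable()-on-non-core and glBindTexture() calls are the ones named above.

```java
import com.jogamp.opengl.GL;

final class SubtitleTextureBindSketch {
    /** Switches to the given subtitle (or video) texture before uploading/drawing into it. */
    static void bind(final GL gl, final boolean coreProfile, final int texTarget, final int texID) {
        if( !coreProfile ) {
            gl.glEnable(texTarget);       // enabling a texture target is fixed-function, non-core only
        }
        gl.glBindTexture(texTarget, texID);
    }
}
```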
streams and languages and add a convenient switchStream(..) entry point.
Audio/video/subtitle stream and language metadata is maintained by arrays holding the stream IDs and language string identifiers.
An implementation for these data sets has been added in FFMPEGMediaPlayer.
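For illustration, a client-side sketch of how the stream/language arrays and switchStream(..) might be used. Only switchStream(..) and the parallel stream-ID/language arrays are taken from the log; the accessor names getAStreams()/getALangs()/getVID()/getSID() and the (vid, aid, sid) parameter form are assumptions.

```java
import com.jogamp.opengl.util.av.GLMediaPlayer;

final class StreamSwitchSketch {
    /** Switches to the first audio stream whose language identifier matches 'lang', if any. */
    static void switchAudioByLanguage(final GLMediaPlayer player, final String lang) {
        final int[] aStreams  = player.getAStreams();   // assumed: audio stream IDs
        final String[] aLangs = player.getALangs();     // assumed: parallel language identifiers
        for(int i = 0; i < aStreams.length; ++i) {
            if( lang.equals(aLangs[i]) ) {
                // assumed signature: keep the current video and subtitle streams, switch audio
                player.switchStream(player.getVID(), aStreams[i], player.getSID());
                return;
            }
        }
    }
}
```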
com.jogamp.common.av.PTS.millisToTimeStr(..)
Chapter metadata is now supported via our FFMPEGMediaPlayer implementation.
Added public method: 'Chapters[] GLMediaPlayer.getChapters()'
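A small sketch of the chapter API in use. getChapters() and com.jogamp.common.av.PTS.millisToTimeStr(..) are named above; the Chapter element type, its fields (id, start, end, title) and the exact millisToTimeStr signature are assumptions here.

```java
import com.jogamp.common.av.PTS;
import com.jogamp.opengl.util.av.GLMediaPlayer;

final class ChapterListSketch {
    /** Prints all chapters with human-readable start/end times. */
    static void printChapters(final GLMediaPlayer player) {
        for(final GLMediaPlayer.Chapter c : player.getChapters()) {   // element type assumed
            System.err.println("Chapter " + c.id + " [" +
                    PTS.millisToTimeStr(c.start) + " .. " +            // field names and
                    PTS.millisToTimeStr(c.end)   + "]: " + c.title);   // signature assumed
        }
    }
}
```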
11), 5.* (Debian 12) and 6.* (current development trunk)
From here on, libav support has been dropped.
The required FFmpeg libraries, whose major versions must fully match between runtime and compile time, are:
- avcodec
- avformat
- avutil
- swresample
The library avdevice is optional and only used for video input devices (camera).
The library avresample has been removed, since FFmpeg itself dropped it in version 6.*;
for lower versions, swresample is preferred anyway.
The major version of each library matching each FFmpeg release
is documented within the FFMPEGMediaPlayer class API-doc.
Each implementation version uses the non-deprecated FFmpeg code-path,
and compilation against the matching header files is warning-free.