The job consists of:
1) Set up three (3) video streams (ffserver) on a Mac Pro running OS X El Capitan.
2) Write some PHP code to manage these streams and do the video blending:
- start/stop a video clip in one stream;
- monitor the video output.
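To make the scope concrete, the three streams could be declared in ffserver roughly as below. This is a minimal sketch only: the port, feed names, paths, resolution and formats are placeholders to be tuned, not a tested configuration.

```
# Hypothetical ffserver.conf sketch -- values are placeholders.
HTTPPort 8090
HTTPBindAddress 0.0.0.0
MaxHTTPConnections 10
MaxClients 10
MaxBandwidth 10000

# One feed per source; separate ffmpeg instances push video into these.
<Feed camera.ffm>
File /tmp/camera.ffm
FileMaxSize 64M
</Feed>

<Feed output.ffm>
File /tmp/output.ffm
FileMaxSize 64M
</Feed>

# Camera stream, always available.
<Stream camera.mjpeg>
Feed camera.ffm
Format mpjpeg
VideoFrameRate 25
VideoSize 1280x720
NoAudio
</Stream>

# The video clip stream and the blended output stream would be declared
# the same way, each with its own <Feed>/<Stream> pair; codec and audio
# choices are still open (audio comes from the clips only).
</Stream>
```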
In a cabin, an actor performs in front of a green background; inside there are a video camera and a screen monitor. Outside the cabin, an operator sitting at the server watches what is happening inside and clicks Start on a video clip. This video is blended (chroma key) with the camera video. After 30 seconds, an automatic recording starts; it runs for 30 seconds, then the video stops and the file is saved.
- Camera stream:
- Video clip stream:
There are 10 different videos (format to be determined), each approximately 60 seconds long. From an HTML page running on the local server, the operator chooses one video and clicks Start, and the video plays to the end. The video can be stopped. A few seconds after the video starts, a recording is performed until the video ends. The file is saved in a folder with a consecutive number.
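The consecutive numbering and the timed recording could be handled with a small helper like the sketch below. The filename pattern, folder layout and stream URL are assumptions to illustrate the idea, not fixed requirements.

```python
import os
import re


def next_recording_path(folder, prefix="take", ext=".mp4"):
    """Return the next consecutively numbered file path in `folder`.

    Filenames look like take_001.mp4, take_002.mp4, ... (the naming
    pattern is an assumption; adapt it to whatever the project settles on).
    """
    pattern = re.compile(rf"{re.escape(prefix)}_(\d+){re.escape(ext)}$")
    numbers = [int(m.group(1)) for name in os.listdir(folder)
               if (m := pattern.match(name))]
    n = max(numbers, default=0) + 1
    return os.path.join(folder, f"{prefix}_{n:03d}{ext}")


def record_command(stream_url, out_path, seconds=30):
    """Build an ffmpeg command that records a stream for `seconds`.

    `stream_url` would point at the ffserver output stream, e.g.
    http://localhost:8090/output.mjpeg (hypothetical name and port).
    """
    return ["ffmpeg", "-y", "-i", stream_url, "-t", str(seconds), out_path]
```

The PHP page would call these (or their PHP equivalents) when the operator clicks Start, launching the returned command with the process API of choice.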
- Output stream:
This stream always broadcasts the camera. When the operator clicks Start, the video clip and the camera are blended with a chroma-key filter. The stream has only two connections: one to the monitor in the room and the other as feedback for the operator. Ideally, this feedback would be embedded in the HTML page. The monitor in the room will be connected to the server as a secondary screen, so the stream must run in a full-screen player or window.
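The blending step could use ffmpeg's `chromakey` filter to key the green background out of the camera picture and overlay the actor on the clip. A sketch of the command builder follows; the URLs, feed name and key parameters are placeholders that would need tuning against the real lighting and green screen.

```python
def blend_command(camera_url, clip_path, feed_url,
                  key_color="0x00FF00", similarity=0.15, blend=0.1):
    """Build an ffmpeg command that keys the green background out of the
    camera video, overlays the keyed actor on the video clip, and pushes
    the blended picture into the ffserver output feed.

    All URLs/paths and the chroma-key parameters are assumptions.
    """
    filtergraph = (
        f"[0:v]chromakey={key_color}:{similarity}:{blend}[keyed];"
        f"[1:v][keyed]overlay=shortest=1[out]"
    )
    return ["ffmpeg",
            "-i", camera_url,    # input 0: camera (green screen)
            "-i", clip_path,     # input 1: video clip (background + audio)
            "-filter_complex", filtergraph,
            "-map", "[out]",
            "-map", "1:a?",      # audio is provided only by the clip
            feed_url]
```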
Latency is critical, as the performer will be watching the feedback monitor. Any tweaks that make the most of the machine's power are welcome.
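As a starting point for the latency tweaks, the options below are commonly used to cut buffering in ffmpeg/ffplay; whether they help here has to be measured on the Mac Pro itself, so treat this as a hedged checklist rather than a recipe.

```python
# Commonly used low-latency options for ffmpeg/ffplay (a starting point,
# not a guaranteed recipe; real gains must be measured on the target machine).
LOW_LATENCY_INPUT_FLAGS = [
    "-fflags", "nobuffer",      # do not buffer input frames
    "-flags", "low_delay",      # reduce codec-level delay
    "-probesize", "32",         # spend less time probing the input
    "-analyzeduration", "0",
]

LOW_LATENCY_X264_FLAGS = [
    "-preset", "ultrafast",     # cheapest encoding, frees CPU for the rest
    "-tune", "zerolatency",     # disable lookahead/B-frame buffering
]


def low_latency_play_command(stream_url):
    """ffplay invocation for the in-room monitor: reduced buffering,
    full screen (-fs), pointed at a hypothetical output stream URL."""
    return ["ffplay", *LOW_LATENCY_INPUT_FLAGS, "-fs", stream_url]
```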
ffmpeg + ffserver + ffplay 3.2.4 are already installed and running.
Although this setup will run on a MAMP server, nothing will be served over the internet. There is only one operator, and only one event will occur at a time.
Audio will be provided only by the video clips.
The video clip format will be set to whichever achieves/is recommended for the best performance.