A somewhat comprehensible guide to using Elphel as a Digital Cinema camera


Research published between June and August 2012.
Audio research published in February 2013.

The objective of this article is to be a comprehensible, easy-to-recall guide for those who wish to use Elphel as a Digital Cinema camera. It aims to cover the major topics in the general use of Elphel – the common issues one may come across, the usual doubts one may have before and after getting access to this beautiful camera.

The topics are:

  • Introduction
  • Choosing a lens – it’s a Super 8 Digital Cinema camera!;
  • Connecting to the camera;
    • An important note on setting FPS;
    • ElphelVision;
    • The ‘getting constantly disconnected’ issue fix;
  • Streaming;
  • Recording;
  • Building a wise setup to monitor your RAW recording;
    • A note on GStreamer and the upside-down Elphel;
    • The x11vnc how-to;
  • Editing;
  • Focus pulling;
    • Putting the focus to the test;
    • Full-frame as resource to focusing;
  • Effective recording quality;
    • The deal with Elphel and CF cards;
    • Recording with bugged CF Cards;
    • Recording with an external SATA drive;
    • What about JP4?;
    • Summing up;
    • Shutter Speed and Exposure time cheatsheet;
  • Audio recording.

It must be said that the reason this article is being written is that we are a team of filmmakers who want to do Digital Cinema our way. We are researching all this in order to shoot a film the way we want to: the open-source way. Our movie is called Floresta Vermelha, and you can find more information about it on our production blog.

There are many like us, and we hang out at the Apertus international community website. Also, the only way all this would be possible – both shooting our film and writing this article – is through Elphel’s program called the Elphel Camera Development Pool, in which we participate.

Elphel has most kindly lent one of their cameras to our project in exchange for this research and documentation, and for the documentation on how to improve editing/post producing a RAW workflow using scripts in linux.

The author of this blog would like to personally thank Sebastian Pichelhofer, Olga Filippova, Andrey Filippov, Oleg Dzhimiev and Nathan Clark, all from the Elphel and/or Apertus teams. Much of the information contained here is based on their help and original documentation – whenever that was the case, the links refer to the original source. I have also tried to improve Elphel’s original wiki texts prior to writing them here, because they are not only where people will look first but also the most trustworthy source on anything related to the use of Elphel cameras.


(the above Apertus logo is a mashup of the original work by Nathan Clark and Sasha Cohen for the community – it is therefore not the official logo and should be used with that in mind)

Choosing a lens – It’s a Super 8 Digital Cinema camera!

If you are planning to use Elphel to shoot a Digital Cinema movie, the first thing that will come to your mind after reading a lot about it on the internet is: “Oh, well, it’s a Super 8 Digital Cinema camera!”. That’s right, and that sums it all up.

Take a look at the image above. There you can see the size of a Super 8 film frame on the left, and the size of an Elphel Aptina sensor on the bottom-right. On the upper-right, there is a juxtaposition of both the sensor and the film dimensions. You will notice the film is slightly wider and the sensor slightly taller, but they are about the same size.

Elphel’s sensor, if used full frame, has a resolution of 2592×1944 pixels. It is, therefore, larger than full HD. Also on the image above, you will notice a purple rectangle representing the frame if we were recording full HD with Elphel – it is important that you understand the image being captured in that example (the purple rectangle) is just part of our whole image (the whole image would be the full frame). It is important because this is the most common scenario we will have to work with.

If you record full-frame with Elphel in a RAW format called JP4, the maximum framerate you will achieve is 15 FPS. Super 8 cameras used to shoot at 18 FPS. But, as we will see throughout this document, we will normally use a RAW format called JP46, which achieves lower framerates. In order to have 24 FPS, we will deliberately choose part of the sensor as our frame, resulting in lower resolutions.

You will have to consider that when choosing your lens.

Elphel cameras can use both C-mount or CS-mount lenses, since they come with an adapter. You can see the lenses that typically come with the camera at Elphel’s site and you can take a look at Apertus community’s page on this issue for good information. The most important piece of information, though, can be found at this discussion at Apertus’ forums, led by member Nathan Clark.

The author of this article lives in Brazil, where this kind of nomenclature is rare. Knowing a lens’s mount type means little when buying one here. It does help, however, to know that you can use screw-mount lenses made for 16mm or Super 8mm film.

I ended up finding an old Fujica Single-8 camera from the 1970s (the Single-8 format is equivalent to Super 8). I love old lenses and I think it was a great buy! Of course, in the section “Focus pulling” below I take a closer look at how that worked out for me. On our production blog, I will also post some test footage shot with that lens, to give a visual idea.

One of my major doubts when looking for a lens was also its relationship to the 35mm film format. This apprehension is simply explained. The Fujinon MA-Z I was buying was a 7.5mm–75mm lens. I noticed the lenses that usually come with Elphel are 4mm–8mm or 4.5mm–13.2mm. In still photography (35mm), a usual set of lenses includes a 24mm, a 35mm and a 50mm.

In other words, I wanted to have an idea of what I was going to see with Elphel when using that lens. But the camera was still to be shipped and I only had my photographic camera!

Some ways to get to that result, beyond the links I mentioned above, were a visual field-of-view comparator, the idea of the ‘normal lens’, as described on Wikipedia, and even the free (GPL) software DOFView, which calculates this for you.

When the camera finally arrived, I was able to look through both and decided that even though the formulas told me a 7.5mm lens is the equivalent of a 40mm on my 35mm camera, it seemed visually more accurate to say it was the equivalent of a 35mm lens.
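If you want a ballpark figure of your own, you can estimate the crop factor from the sensor geometry. The 2.2 µm pixel pitch below is my assumption for the 5 MP Aptina sensor (verify it against your sensor’s datasheet), so treat the result as a rough sketch – different formulas and assumptions land a few millimeters apart, as my 40mm-vs-35mm experience above shows.

```shell
# Rough 35mm-equivalence estimate for the full 2592x1944 sensor.
# ASSUMPTION: 2.2 um pixel pitch (check your sensor's datasheet).
awk 'BEGIN {
  pitch = 0.0022                      # pixel pitch in mm (assumed)
  w = 2592 * pitch                    # sensor width, ~5.70 mm
  h = 1944 * pitch                    # sensor height, ~4.28 mm
  diag = sqrt(w * w + h * h)          # sensor diagonal, ~7.13 mm
  crop = 43.27 / diag                 # 43.27 mm = 35mm-frame diagonal
  printf "crop factor: ~%.1f\n", crop
  printf "a 7.5 mm lens behaves roughly like a %.0f mm lens on 35mm\n", 7.5 * crop
}'
```

Remember this assumes the full sensor; if you use only part of it (the purple rectangle), the effective crop factor grows accordingly.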

Please remember how we started this section. You will most probably not use Elphel’s full sensor to capture the images for your movie (remember the purple rectangle). So you have to consider how much of the sensor area you will actually use, and only then will you have an idea of what you are going to have on camera.

For our film, we are considering using the maximum of the frame horizontally and calculating the vertical size using the cinema standard 2.39:1 aspect ratio. We then have to see whether Elphel’s processor can handle this resolution at 24 frames per second in JP46 RAW mode.
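That height calculation can be sketched in one line. The rounding to a multiple of 4 is my assumption about which WOI heights the camera accepts, so check Elphel’s wiki before relying on it.

```shell
# Frame height for the full 2592 px sensor width at 2.39:1.
# ASSUMPTION: WOI height should be a multiple of 4 (check Elphel's wiki).
awk 'BEGIN {
  width = 2592
  ratio = 2.39
  h = int(width / ratio)    # 1084.5... -> 1084
  h = h - (h % 4)           # 1084 is already a multiple of 4
  printf "%d x %d\n", width, h
}'
```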

Connecting to the camera

The Elphel camera is accessed and controlled via LAN cable. The documentation on how to do this step properly can be found at Elphel’s wiki, on this page. I will detail here some of the most important parts of its interface in order to get you easily situated.

This is Elphel’s main interface. It is the “Welcome aboard” screen, the first thing you will see when you type its IP address into your web browser. It shows, on the right, a live still image as currently seen by the camera, and presents some links to dig further.

The first link, “Camera Control Interface”, is the main control interface. By clicking on it, you will see something like the image below.

Here we can see the live display (stream) of the image as seen by the camera, some controls on the upper-left, and a WOI, a “window of interest”, at the center of the image. This is the area Elphel uses to help you focus the image.

In our case, for shooting Digital Cinema, we may benefit from disabling the automatic processes related to Autoexposure, Auto White Balance and so on, leaving the camera completely manual. We’ll see more from that later.

In case you are wondering, yes, this is a page of the brilliant book Photographs, by Mona Kuhn. If you are a photographer, then take some time to look at her pictures.

For now, we want to click on the first button, the camera icon, at the extreme upper-left of our image.

The button brings us the camera controls. Of course, we can have everything open at the moment of recording, but usually we will isolate this area to be used together with a terminal. Use the image above as a guide for the following text:

1. We can see that I am currently using a frame size of 2240×976, at 23.986 FPS (more information on how to set these parameters below). This part also shows that my frames are 481K in size, so at 24 FPS I will have 11,544 KB/s (about 11.5 MB/s) of data transfer. Please consider that this frame size may vary during recording, due to changes of light or movement in the scene. In the section “Effective recording quality”, there are detailed explanations of the implications of this information.
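The arithmetic behind that number is simply frame size times framerate:

```shell
# Data rate = frame size x framerate.
FRAME_KB=481   # size of one compressed frame, as shown by the interface
FPS=24
RATE_KB=$((FRAME_KB * FPS))
echo "$RATE_KB KB/s"                                      # 11544 KB/s
awk -v kb="$RATE_KB" 'BEGIN { printf "~%.1f MB/s\n", kb / 1000 }'
```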

2. This is the live histogram of our image. The same stream we see large behind these controls can be seen smaller here, in the grey box. This is called the WOI, or “window of interest”, in the camera documentation. The grid behind the smaller stream (WOI) is the equivalent of Elphel’s full frame, 2592×1944. Since we’re not using the full frame, the image we see here is positioned according to which part of the frame we are actually using (in my case, the central part, but you can use any other part you want). The orange rectangle currently at the center of the stream is the area used to calculate the incidence of light on the subject – therefore, it is this rectangle that feeds the histogram.

3. We use this smaller part a lot. This is where you control shutter speed, black level, gain, saturation and so on. At this link of Elphel’s wiki there is also a description of this region.

One important detail: the last button on the upper menu, the icon showing a question mark, can be pressed at any time. It will bring up a dialog box with information as you hover the mouse over the sections or buttons of this interface.

Also, the shutter speed in Elphel is specified in milliseconds. To make things easier for those used to the photographic approach, in fractions of a second, I’ve made an equivalence cheatsheet that can be seen at the bottom of this article.
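If you’d rather generate such a table than memorize it, a small loop does the conversion (the list of denominators is just an example set):

```shell
# Convert photographic shutter speeds (1/N s) to milliseconds,
# which is how Elphel expects the exposure to be set.
for denom in 24 30 50 60 100 125 250 500 1000; do
  awk -v d="$denom" 'BEGIN { printf "1/%-4d s = %7.2f ms\n", d, 1000 / d }'
done
```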

To go on, let’s click on “More details” (number 4 in the image above).

This brings us a series of tabs; we’ll now see the most important of them. The first one is tab 5, which allows us to set the JPEG quality of our recording (96% here), whether the image is flipped (in this case it is, since I’ve mounted my camera upside down, due to the lens), and the recording format.

About the recording format: Elphel records in three main formats. The first, “color”, shown in the image above, is a sequence of JPEG images in a movie container (MOV or OGM). Then there is “JP46”, which is a RAW format and the one we’ll normally use. It records a sequence of RAW images prior to the debayering process, so they must necessarily be post-processed. To have a live, colored stream of this video, you must use a program called GStreamer. Finally, there is “JP4”, which is also a RAW format, but it skips some of the camera’s internal processes, allowing you to achieve higher FPS at the cost of not being able to have a live stream to monitor what you are recording.

Please take a look at this excellent video by Sebastian Pichelhofer, which explains this issue far better than I am able to, with subtitles in many languages.

Here, I changed to the “JP46” format. Notice the image is now black-and-white and full of squares. Also, up above, you can see the frame size dropped from 481K to 401K, which is worth noting – we can record at higher quality using the RAW format.

Continuing the tab tour, here we can see the second important tab we’ll use. It is important because it is here that we can disable the big stream of our image (at “Display mode”), leaving us with just these controls, which is what we want. We will typically use the window with only the controls together with a terminal or a second instance of the web browser.

Let’s go back to the entrance screen. Here, we still want to explore two links, the “Disk recorder” area and the “Parameter Editor” area.

Here, we see the interface for “Disk recorder” as it is opened. On the left there is a place that shows the media mounted by Elphel – this can be a Compact Flash card or an external HDD. Once you have recorded videos or created folders on your media, this is where they will show up.

On the right, we can see some tabs; currently the “status” tab is selected. This tab is used mostly if you use this interface to record the movies. You would press the large “Record” button above the tabs and some information would start being fed here; since this works with PHP, a PHP process updates this info every 2 seconds in order to keep you informed about the recording status. You can notice we already have some info on frame size, compression and framerate being displayed.

Here, we have done two things. First, we have opened the “Format” tab. This is where we choose whether we want to record in a MOV or OGM container (the first is preferred, since it uses less CPU power) and where we can mount the external media.

If we mount the external media from here, it will show up in the box on the left, as we have mentioned. However, you may want, for some reason, to mount the media via the command line. What Elphel does is create a hdd directory at /var, mount the media there and then create a webshare. In the terminal, it would be something like this:

mkdir /var/hdd
mount /dev/hda1 /var/hdd
ln -s /var/hdd /mnt/flash/html/hdd

Special thanks to Sebastian Pichelhofer for the above information.

In case you mount your media somewhere else, say /mnt/0, you’d have to update the above so that this interface is able to see the files.
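For example, a sketch of what that could look like for /mnt/0 – the device name /dev/hda1 is copied from the example above and may differ on your camera:

```shell
# Mount the media at /mnt/0 instead of /var/hdd, then re-point the
# webshare so the Disk Recorder interface still finds the files.
mkdir -p /mnt/0
mount /dev/hda1 /mnt/0            # device name may differ on your camera
rm -f /mnt/flash/html/hdd         # drop any previous webshare link
ln -s /mnt/0 /mnt/flash/html/hdd
```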

The second thing that has changed since the last image is that we have clicked on the “show buffer” link above the “Record” button. It opened a progress bar that indicates the amount of buffer being used while recording.

This part works like this: say you are recording at low resolution. As a consequence, you’d have a low data rate being transferred from the camera to your media (the CF card or external HDD). Then this progress bar will always be fine; it will hardly ever show Elphel using its buffer.

Now, say you are recording at very high resolution, as is our case here – we have a very large frame size and a high framerate and compression. If I pressed record, the camera would have difficulty doing all of this at once – processing such a large image in such a short time, running the PHP updates for the progress bar, recording to the media and (usually) feeding a live stream of the video. It would be too much.

Whenever a circumstance like this happens, the camera starts buffering the images to see if it can compensate and write them to the media as soon as it can. If it can’t – and in our last case it can’t – it will have to drop frames. And once a frame is dropped, Elphel closes a .mov file with what it was able to record, starts another recording from zero and outputs a “buffer overrun” message.

For Digital Cinema, we will not use this interface to record, because we do not want the PHP process it requires to demand more processing power from an already loaded camera CPU. But you may want to use it for lower resolutions, such as DV.

It has to be mentioned that you can “Play” your recorded files using this interface. You just choose the video from the list and click on it, as can be seen in the image above. The video looks like that because I am recording in RAW, so what you see is the video before debayering.

This image shows the last very useful tab, “Advanced”, where you can enlarge the maximum size and length of your videos. This is a must for Digital Cinema, since a typical short scene will take more than 200 MB and can easily run longer than a minute. Just put a generously large number here so that you won’t have to worry about this anymore.

The image above also shows a typical use of Elphel’s interface. You’d have the first instance of the web browser showing Elphel’s controls on the right and something else on the left – in our case, those windows will typically be a terminal, an instance of GStreamer showing a live stream of the video and some text cheatsheets in the background.

This is the “Parameter Editor”. Together with the “Camera Control Interface” (the camera controls on the right), it is the most important set of controls in Elphel. This is because pretty much everything you may want to change can be found here: framerate, frame size, which part of the sensor you want to use, which format you want to record (Color, JP46, JP4) and so on.

As you can see in the lower part, here you can also save some settings so that the camera recognizes your preferences when initialized (booted). In this case, I have been testing recording at the highest possible resolutions to find the real maximum frame size I can use when recording at 24 FPS in JP46 RAW.

To change the parameters, the easiest way is to press the “Select All” button on the top-left and then click on the link “View / Edit Current”. It will give you the interface below.

There they are: all the parameters you can change. When you hover the cursor over the parameters, Elphel shows a dialog box describing them briefly. One of the things you will definitely want to change for Digital Cinema is the “WOI”, or “window of interest”. It corresponds to our frame size. Note that once you change it, you won’t be using the whole sensor. To center your WOI (that is, to use the central area of the sensor instead of the corner), you have to change the parameters “WOI_TOP” and “WOI_LEFT”.
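The centering arithmetic is simple: subtract the WOI size from the sensor size and halve it. A sketch for the 2240×976 window used earlier:

```shell
# Center a 2240x976 WOI on the 2592x1944 sensor.
SENSOR_W=2592; SENSOR_H=1944
WOI_W=2240;   WOI_H=976
WOI_LEFT=$(( (SENSOR_W - WOI_W) / 2 ))
WOI_TOP=$((  (SENSOR_H - WOI_H) / 2 ))
echo "WOI_LEFT=$WOI_LEFT WOI_TOP=$WOI_TOP"    # 176 and 484
```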

An important note on setting FPS

The best link describing how to set the framerate is this page at Elphel’s wiki. As you can see there, if you want a whole-number FPS such as 24 or 30, you have to use “Triggered Mode”. “Free Running Mode” typically gives you framerates such as 23.998 or 23.866.

However, when using the camera during a recording in the field, I noticed that it was very hard to stick to the “Triggered Mode”. I’ll explain.

In the camera controls, we see this kind of information in the display:

2240 x 976 @ 23.986 fps ----- 481K

The framerate part is green when everything is OK but changes color when it’s not.

Whenever I fixed the FPS using “Triggered Mode” at such high resolutions, it was tough making the number stay green. I would have to stick to a fixed exposure (shutter speed) and mark the framerate as fixed in one of the controls. Even though the number stayed green that way, I was never sure everything was really OK, because the number would simply stick to green. I also noticed the videos recorded this way presented lower framerates when analysed with FFmpeg.

I never had this problem when using “Free Running Mode”. Even though it results in a fractional framerate, it is very easy to know when things are going wrong, because the color changes readily. If you change the exposure, for example, the color may change depending on the value, making it clear things are not OK. The videos I recorded this way presented a steady FPS, the same value I had set and that was shown on the display.

Of course, please check this procedure yourself before simply trusting my words.


ElphelVision

All of the controls described above can be handled with a very nice interface developed by Sebastian Pichelhofer called ElphelVision. In the section “Effective recording quality”, I describe why I’m not using it, but it is definitely worth a look, because it is quite suitable for Digital Cinema with Elphel.

The two most important links are the Install Guide and the ElphelVision User Guide.

The ‘getting constantly disconnected’ issue fix

For my setup, I am using two computers. The first is a desktop, an i7 with two wired network interfaces and one wi-fi interface. I use it to edit my videos, but I also used it for some tests with Elphel. It runs Debian Stable. The second is the field computer, an old, recycled Pentium M laptop from 2005, with one wired network interface and one wi-fi interface. It runs Debian Testing.

I mention this because it may be worth knowing how I access Elphel from each of them, and how I solved an issue that showed up with the laptop – I was getting constantly disconnected from Elphel’s network.

On the Desktop:
If I access Elphel via a router, I have no problems whatsoever; it just connects. If I access it directly (with no router), I have to bring down all my network interfaces – eth0, eth1 and wlan0 – using “ifconfig [interface] down” for each of them. Then I run the same command as I would use when connecting via the router, and things go just fine. I use GNOME, and it does bring my wlan0 interface up automatically after I do that, but I can no longer use the internet.

On the laptop:
Things here are a bit different. I am not an expert on networks, so I am not able to bring a network up out of the blue. After many experiments, I decided the best thing to do with the laptop was to configure network-manager in GNOME to auto-connect to the wired interface when booting, using a manual IP (see image below).

This means I must have the camera cable plugged in when booting the laptop. Only then does it successfully bring up the eth0 I need (otherwise, I can still bring it up with ifconfig, but the camera’s lights won’t turn on and I won’t be able to connect to it).

I also noticed that even after successfully connecting, I would be disconnected constantly while using Elphel. This is not a problem with the camera: network-manager keeps trying to connect to my wi-fi network and drops the wired connection. It must be mentioned that I also have wicd installed on the laptop, because I use Enlightenment when not recording with Elphel.

The only way I was able to solve this problem was to create a wired connection with a fixed IP in network-manager and set the program to auto-connect to it when booting. This way things stay fine, and network-manager stops dropping my wired connection to prioritize the wi-fi.


Streaming

For better understanding, this section should be read together with the section “Recording”, right below it.

You can watch the video as it is being seen by the camera via streaming. For that, you can use any program able to stream video over the network, such as MPlayer, VLC and the like. At this link of Elphel’s wiki, there are examples of how to do that. This procedure is common when you will not record in any of the RAW modes.

The usual scenario, however – at least concerning Digital Cinema – is that you will want to record your video in RAW. In order to really watch your scene, you need software that does the debayering in real time – GStreamer – transforming a bunch of small black-and-white squares into a colored transmission.

If you want a detailed overview of this process, you can see Alexandre Poltorak’s article at Elphel’s Development Blog. Notes on how to install it can be found at Elphel’s wiki. And the best link on how to use it, full of useful examples, can be found at Google Code.

There is an additional, important piece of information beyond the examples found in all those links. It will be very common, in a field situation, to want to watch the video you have just recorded. Since it will be a large file, it is impractical to send it over the LAN to your base computer to watch it.

So there must be a way of watching – instead of just the live stream coming from the camera or a video stored locally on your computer – a file that is stored on a remote media (your external HDD or CF card) via LAN. This is what this line does:

gst-launch-0.10 souphttpsrc location=http://<camera-ip>/hdd/<file>.mov ! decodebin ! ffmpegcolorspace ! queue ! jp462bayer threads=1 ! "video/x-raw-bayer, width=(int)568, height=(int)240, format=(string)gbrg" ! queue ! bayer2rgb2 method=0 ! queue ! ffmpegcolorspace ! autovideosink

When Elphel records, it uses very long names for the files. It is also impractical to type this name into GStreamer every time you want to watch a recorded video. The easiest way to overcome this is to first create a symbolic link to the file you want to access using a very short and easy name, then give this name to GStreamer. To make things easy, let’s turn this into a script:

Copy this script to your camera and save it in the same folder as your videos as “link_video.sh”:


# Run this script as follows:
# sh link_video.sh [NUMBER]
# In which [NUMBER] is the video you want to watch;
# the count is made backwards, so 1 will be the last
# recorded video, 2 the one before it and so on.

VIDEO=`ls -t *.mov | head -"$1" | tail -1`;
# -f overwrites a previous video.mov link, so the script can be re-run
ln -sf `pwd`/$VIDEO `pwd`/video.mov;

Now, if you want to watch the last video you recorded, you’d run from inside the camera:

sh link_video.sh 1

Then, in your base computer, the one that is accessing the camera via LAN (considering a fullHD video):

gst-launch-0.10 souphttpsrc location=http://<camera-ip>/hdd/video.mov ! decodebin ! ffmpegcolorspace ! queue ! jp462bayer threads=1 ! "video/x-raw-bayer, width=(int)1920, height=(int)1088, format=(string)gbrg" ! queue ! bayer2rgb2 method=0 ! queue ! ffmpegcolorspace ! autovideosink

Of course, you can also transform this large command line into a script, so that you just run a simple command. It would be something like this (save it as ‘watch_video.sh’):


# Run this script as follows:
# sh watch_video.sh

# Set this to your camera's IP address (192.168.0.9 is Elphel's default):
IP=192.168.0.9

gst-launch-0.10 souphttpsrc location=http://$IP/hdd/video.mov ! decodebin ! ffmpegcolorspace ! queue ! jp462bayer threads=1 ! "video/x-raw-bayer, width=(int)1920, height=(int)1088, format=(string)gbrg" ! queue ! bayer2rgb2 method=0 ! queue ! ffmpegcolorspace ! autovideosink

Also, you can customize this script to the resolution you’re using, your IP and so on.


Recording

As has been said, Elphel is very versatile in many aspects. In keeping with that, there are three ways of recording footage with the camera.

First, there is Elphel’s user interface (camogmgui). It is easy to use and well documented, but it won’t be our choice. When recording at high frame sizes, as in our case, we have to know whether we are dropping frames. Camogmgui has the option of displaying its buffer (the way to monitor framedrop), but when it does, a PHP process runs every 2 seconds to update this information for us and drains some of the camera’s processing power. If we are close to the limit (see previous section), it will also fail to update the buffer information accordingly.

The second option is ElphelVision. This is a remarkable user interface that stands many steps above camogmgui in terms of design. It is also designed for Apertus, so it translates some important information (such as shutter speed) into a language that photographers or cameramen are more used to. It is easy to install and also well documented, but it will not be our choice either.

In our project, we are using an old laptop as the basis for camera control and part of the monitoring. ElphelVision was designed to run on newer machines, with newer graphics card drivers – even proprietary ones. It runs two instances of Java, one of which is a transparent layer that stands in front of the video stream coming from Elphel – this layer, however, is only transparent with newer graphics drivers. So we have to keep minimizing the second Java window to use it on our old laptop. Also, it streams the video using VLCj, which has higher latency than MPlayer or GStreamer on our machine.

The third option, the one we will use, is command-line recording. You enter the camera using a shell and start the processes from there. This demands very few resources. However, since the command lines are huge, it is easier if done with scripts.

The three scripts below do the following:

1. Kills all camogm and autoexposure processes in the camera and starts a camogm instance from that same shell;
2. Starts recording in MOV format, using parameters previously set in the camera. It also restates the duration limit (6000 s) and the maximum video length (around 10 GB per file, more than enough!), to ensure we won’t have a split in the middle of a scene, dropping frames;
3. Stops recording.

The scripts don’t have a license here, but they are GPL, so use at your own will.


killall camogm;
killall autoexposure;

camogm /var/state/camogm_cmd &

echo "status; exif=1; format=mov; duration=6000; length=10000000000; status=/var/tmp/camogm.status" > /var/state/camogm_cmd &

echo "Camogm has started and is waiting for recording..."


killall autoexposure;
echo "Started recording..."

# Set these to match your setup (media mount point and file name prefix):
MOUNT_POINT=/var/hdd
RECORDING_PREFIX=video_

echo "status; exif=1; format=mov; duration=6000; length=10000000000; prefix=$MOUNT_POINT/$RECORDING_PREFIX; start; status=/var/tmp/camogm.status" > /var/state/camogm_cmd &

# Use the same mount point as in the recording script:
MOUNT_POINT=/var/hdd

echo "status; stop; status=/var/tmp/camogm.status" > /var/state/camogm_cmd &
echo "Stopped recording."
ls -t $MOUNT_POINT | head -1


To end this section, it must be said that our old laptop, a single-processor Pentium M 3.0 GHz (equivalent to a Pentium 4), will struggle to debayer the RAW video in real time, even using GStreamer at 1/4 of the original resolution. It presents higher latency than it should and drops frames. This has a simple solution, though. Using top, we can see that what is draining power is Iceweasel (we are on Debian). Just close Iceweasel’s main instance, leaving open only Elphel’s window – that same image we saw right below the last section’s title (“Effective recording quality”). You will be fine! Latency will drop enormously and the video will play in near real-time.

Building a wise setup to monitor your RAW recording

Currently, Elphel does not have a viewfinder. There are ideas and prototypes on how to build one, but nothing concrete yet. So there are three main possibilities for monitoring what you are recording.

The first one is the most common. It assumes you will use a laptop or any portable computer, connected to the camera via LAN cable. This computer can either be some meters from the camera or rigged in a way that it is nearly ‘glued’ to it. At Apertus’ site, you can see some pictures that exemplify this kind of prototype and monitoring.

If you have a laptop tied to your setup, you can control the camera and monitor the recording, but this may be somewhat cumbersome in the field, where you will have to pay attention to many things beyond the camera controls. And the controls, as we have seen, can depend on a terminal. Even with ElphelVision and a touchscreen device (terminal-independent), this could be too much.

If you have the laptop a bit distant from the camera, the person at the laptop will be fine, but the camera operator won’t have anything to go by.

The second and third possibilities assume you have two people operating the camera: one at the laptop, further from the scene (this can include other people, such as the Director and the Director of Photography), and the other with the camera in her/his hands or on a tripod. This can be seen in the image above.

So, if you have a smartphone, or any device with better wi-fi such as a tablet, what you can do is receive the stream from the camera via LAN cable on a laptop and re-send it to this mobile device via wi-fi. The better the wi-fi of your device, the better this setup may work.

The way to do this is to install a VNC server (x11vnc) on the laptop connected to the camera and a VNC client on the smartphone or tablet that will be used by the person holding the camera, for monitoring. There is a little lag between the actual scene the actors are performing and what you will actually see on the smartphone.

The two main problems are (1) this process can drain the gadget’s battery quickly, so it is better kept connected to a power source, increasing the number of cables in the setup, and (2) if the device’s wi-fi board is weak, you will see a very low FPS on it.

If you stick to shots without camera movement, or in which the camera moves very little, this can be a valid solution (use what you have at hand!). My girlfriend had a mobile phone I used to test this, and things went reasonably well. The command I used on the laptop to send the signal was:

livre@laika:~$ x11vnc -forever -clip 80x44+0+85 -shared -viewonly -nodragging -noxdamage -notruecolor -speeds modem -fs 0.75

Notice I’m sending a very tiny portion of my Desktop (an area of 80×44) over wi-fi. I then receive it with TightVNC, using its options to get the signal at ¼ of the original resolution, autoscale it to fit the screen, force 8 bits and disable the controls. As you can expect, the result is rough video, but enough to let you keep the actors in frame and the framing under control.

All the process on how to install and use VNC is described below.

The last possibility for monitoring our recording is the simplest, the least painful and, probably, the best. You can mount the whole setup (camera, lens, external HDD etc.) in a way that puts the Elphel upside down. This frees the tripod hole on its underside, so you can use a headless screw to mount a second, light camera there. In the image above, you can see a headed screw in the hole. This also has the extra advantage of showing Elphel’s logo to everyone (proving we have an Open Hardware project).

Here you can see what it looks like in the field. Notice that the sensors of both cameras sit at nearly the same distance from the subject. Then you just find an equivalence between Elphel’s lens and the zoom of your second camera.

This idea is inspired by Oscar Spierenburg’s early prototypes, in which he used a photographic camera beside the Elphel. However, by inverting the Elphel and freeing the tripod hole, we get a much better relationship between the two devices.

Keep in mind, though, that this means two people working as camera crew. The other person, at the laptop, must make sure everything is running fine with the camera controls.

A note on GStreamer and the upside-down Elphel

When using GStreamer with an Elphel mounted upside down, this part of the code you use for streaming:

"video/x-raw-bayer, width=(int)960, height=(int)544, framerate=(fraction)24/1, format=(string)gbrg"

has to be replaced by one with the color channels swapped (grbg):

"video/x-raw-bayer, width=(int)960, height=(int)544, framerate=(fraction)24/1, format=(string)grbg"

The x11vnc how-to

Install these two programs on the computer connected to Elphel via LAN cable: tightvncserver and x11vnc;
apt-get install x11vnc tightvncserver

Run this command as non-root to serve your screen, protected by a password (the password file can be created beforehand with “x11vnc -storepasswd”):

x11vnc -forever -shared -rfbauth ~/.vnc/passwd

In another terminal, find out your IP address, typically with ifconfig (in Debian, you have to be root to do that).

Now go to your other laptop or smartphone. Install any VNC viewer. In my case, I’ve installed a program called “TightVNC Viewer”. I tested it with Linux, Windows and Windows Mobile, they all worked.

Skip the password configuration during installation and run only the TightVNC Viewer program – you don’t need to run the server;

In the box, put the IP address of your host machine followed by “:0”.

The program will prompt you for the password you set up in the previous section. You are connected – and not just connected: if you have a touchscreen, you can even control the other machine. In the previous section you also have the command that worked best for sending the signal from my laptop to the smartphone.


Editing

The workflow to edit RAW videos in Linux is a complex topic in itself, and is the subject of another article. The research is quite complete and can be found in detail in this link. As you can see there, editing is a process that relies heavily on automation scripts and processing power. With the automated processes, editing RAW in Linux comes close to the currently existing HDV or DV workflows.

Focus pulling

Focusing is one of those issues that are crucial when acquiring images. Due to the lack of a viewfinder and to the way we have to monitor our recording (see section “Building a wise setup to monitor your recording”), it is also one of the most difficult tasks when using Elphel.

Elphel does have an internal program designed to help you with that. You can access it with the following address:


As you can see from the link above, if you choose the method “Show: focus”, the camera gives you a black-and-white image of the edges in your picture, giving you an idea of where the focus is.

Now, you must keep in mind that when using the camera, you have to build a setup where everything comes together – the camera, the lens, the external HDD, the cables and possibly a photographic camera used as a viewfinder. You will also probably have to focus the lens manually (you could imagine building an Arduino rig to help with this, but that would require time and development, for it doesn’t exist yet).

I’m saying all this because, to shoot our movie, we took all of that into account and decided to use a fixed focus point, balancing it with the iris opening and the depth of field.

Technically, there is a program that should help you a lot with that: DOFView. In the images below, I have adjusted it both to the Elphel 353’s sensor size (5,7mm × 4,28mm) and to my lens’ zoom range (7.5mm to 75mm).

I can then enter my aperture’s F-number (f/1.8) and the subject’s distance (2m), and it calculates everything for me.
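If you want to sanity-check DOFView’s output, the math behind it is the standard thin-lens depth-of-field approximation. Below is a sketch using the values above; the 0.005 mm circle of confusion is my own guess for this small sensor, so treat the exact limits with a grain of salt:

```shell
# Depth of field for f=7.5mm, f/1.8, subject at 2m (all lengths in mm).
# The circle of confusion (c) is an assumed value, not taken from DOFView.
awk 'BEGIN {
    f = 7.5; N = 1.8; c = 0.005; s = 2000;
    H = f * f / (N * c) + f;          # hyperfocal distance
    near = H * s / (H + s);           # near focus limit (approximation)
    far  = H * s / (H - s);           # far focus limit (only valid while s < H)
    printf "hyperfocal: %.2f m\n", H / 1000;
    printf "in focus from %.2f m to %.2f m\n", near / 1000, far / 1000;
}'
# prints:
# hyperfocal: 6.26 m
# in focus from 1.52 m to 2.94 m
```

Focusing at the hyperfocal distance instead would keep everything from roughly half that distance to infinity acceptably sharp.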

Putting the focus to the test

Now, even though I do trust the technique very much, I am uncomfortable when dealing with an old lens that has passed through many situations I am not aware of, and that is not one of the lenses that usually come with the Elphel cameras. I’m using a Fujinon MA-Z lens, for Super 8 movies, as below.

Also, we plan to shoot a movie, with actors, production design, professional lighting etc. It would be a shame to blindly trust the theory and then discover, with everything recorded, that it was all a bit out of focus.

So I put it all to the test. I placed several objects in the entrance of my house, spaced 1 meter from each other. The door at the back stands 12 meters from the edge of my lens and gives us a good idea of “infinity”, since its glass panes have friezes that serve as edges. This is roughly how it looked:

Now, to give you an idea, the image above was taken with the lens focused at infinity and the aperture at 5.6. The opposite extreme is the image below, with the focus set at 1.5m from the lens and the aperture at 1.8.

The images we have seen are not surprising at all. What is surprising is that the usual relationship between focus distance and lens aperture in photography didn’t quite hold for me.

I found that my Fujinon lens gives somewhat similar results when I use the same aperture (say, 1.8) with different focus distances (say, 1.5m or 20m). Of course, at a 1.5m distance the whole background is extremely out of focus, but at 20m the background is still sort of blurred and the best-focused thing is the book 1m from the lens!

In practical terms, the tests show if we want things to be in focus, we must use an aperture of at least 4. If we want to isolate the subject in the foreground, then we can think of using apertures of 1.8 or 2.8. The table below gives a rough idea of what we can expect of the lens, in terms of focus.

Full-frame as resource to focusing

One last note on this subject is that you can use a still, full-frame image from Elphel’s sensor to have an idea if things are focused. In the camera controls, there is an icon that looks like a painting with a tree on it.

The best thing then is to shoot this frame in black and white and not in any RAW mode, so that you can get a good idea of how the image actually looks.

Effective recording quality

Elphel is very versatile. It allows you to record using customizable frame sizes, fps, window of interest and compression ratio, among other things. When shooting at the highest quality available, the results can be impressive, especially when you consider we can use RAW.

For Digital Cinema, in which high resolutions are used, a common concern arises. Is there a way of knowing what the real maximum quality we can get is? What exactly does the JPEG Quality percentage mean in terms of final output? We will see, for example, that we can never achieve 100% for larger images.

To answer these questions, we have to take a look at an old and excellent MPlayer article called Encoding Quality – or Why Automatism is Bad. It tells us that a way of measuring video quality is by knowing its effective bits per pixels (BPP). Depending on the value you get, sometimes it is even more important to increase it, at the cost of having a smaller frame size.

The BPP can be found with this formula:

         $videobitrate * 1000       
$bpp = -----------------------
       $width * $height * $fps
($videobitrate is in kbit/s)
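To make the formula concrete, here is the calculation for the first row of the 800×608 table below (41081 kb/s at 23.98 fps), done with awk:

```shell
# BPP = (videobitrate in kbit/s * 1000) / (width * height * fps)
awk 'BEGIN { printf "%.2f\n", 41081 * 1000 / (800 * 608 * 23.98) }'
# prints: 3.52
```

This matches the ~3,5 bpp figure in the table.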

To run this test, we record many scenes, each with a different JPEG Quality value, fetch their kb/s with FFmpeg and calculate. Let’s try it with JP46 RAW at two different resolutions: 800×608 and 2064×896. The first is just a small frame size, to make sure we can go all the way up to 100% quality; the second is a value I am experimenting with for shooting – it is equivalent to 200% of HD (720p) and 88,5% of full HD (1088p).

We’ll use the following information from the camera:

First round: 800×608 @ 23.98 fps JP46
JPEG Quality        Resulting video kb/s         Frame size (K)    Resulting BPP
100%     41081 kb/s          207k      3,5 bpp
99%       38566 kb/s          194k      3,35 bpp
98%       29804 kb/s          149k      2,58 bpp
97%       23568 kb/s          118k      2,04 bpp
96%       19566 kb/s          97k         1,69 bpp
95%       15247 kb/s          82k         1,32 bpp
90%       7410 kb/s            47k         0,64 bpp
85%       5938 kb/s            34k         0,51 bpp
80%       4229 kb/s            19k         0,36 bpp
75%       3597 kb/s            17k         0,31 bpp

Second round: 2064×896 @ 23.98 fps JP46
JPEG Quality        Resulting video kb/s         Frame size (K)    Resulting BPP
100%     158049 kb/s        798k      3,56 bpp
99%       140456 kb/s        711k      3,16 bpp
98%       106150 kb/s        537k      2,39 bpp
97%       82918 kb/s          419k      1,86 bpp
96%       66954 kb/s          338k      1,50 bpp
95%       51428 kb/s          262k      1,15 bpp
92%       34776 kb/s          154k      0,78 bpp
90%       26437 kb/s          132k      0,59 bpp
85%       17372 kb/s          86k         0,39 bpp
80%       12573 kb/s          59k         0,28 bpp
75%       11003 kb/s          53k         0,24 bpp

The result of this test is better expressed below:

Some things become clear now. First: as long as the JPEG Quality value in the camera is the same, the resulting BPP is equivalent for different resolutions, which is somewhat expected. Second: the BPP of the image decreases drastically in the region between 95-100%, forming a steeper curve. Once you record below 90%, the resulting BPP doesn’t change that much.

I’ve also added the output I got when probing a DV and an HDV MPEG2 file, both recorded with my Sony HDR-FX1 – since that camera records at only one quality, I assume it to be “100%”. Even though MPEG2 is not a RAW format, both give us an idea of where we stand with Elphel. For completeness’ sake, here is their full info:

Duration: 00:14:59.89, start: 0.000000, bitrate: 28774 kb/s
Stream #0.0(eng): Video: dvvideo, yuv411p, 720x480 [PAR 8:9 DAR 4:3], 28771 kb/s, 29.97 fps, 29.97 tbr, 30k tbn, 29.97 tbc

Duration: 00:58:53.49, start: 0.433767, bitrate: 26128 kb/s
Program 100
Stream #0.0[0x810]: Video: mpeg2video (Main), yuv420p, 1440x1080 [PAR 4:3 DAR 16:9], 25000 kb/s, 29.97 fps, 29.97 tbr, 90k tbn, 59.94 tbc

Now we know the maximum quality that can be achieved by Elphel, in comparable terms. One must have in mind though that our intention is to shoot for Digital Cinema.

As mentioned, when using high resolutions, we won’t be able to reach 100%. Sometimes the restriction comes from the media we’re recording to, as is the case with LAN CAT5 cables or incompatible, buggy Compact Flash cards (see below). If we use external HDDs (IDE or SATA), the restriction comes from Elphel’s CPU – it simply cannot process that amount of data.

The deal with Elphel and CF cards

Before we continue, it may be worth explaining why many Compact Flash cards on the market won’t work with Elphel. Special thanks to Andrey Filippov, Elphel’s creator and developer, and Oleg Dzhimiev, Elphel’s engineer, who were of particular help in this section.

Most CF cards claim to support the PIO, DMA and UDMA modes, but many of them actually support only the first and the last. Since we usually use the fastest mode, UDMA, this fault goes unnoticed. The Axis CPU used in the camera also has a bug – when a card does not support DMA, Elphel falls back to PIO (it won’t use UDMA). This makes recording to the card slow and also drains the camera’s processing power, as will become clear in the following tests.

Elphel uses and recommends the “Sandisk Extreme III” CF card. It also has a blacklist of the known buggy CF cards (lines 103~136).

I had two 4GB Kingston 133x (20 MB/s) CF cards I wanted to use, and they were not on the blacklist. We put them to the test to see whether they supported DMA – in other words, whether they could be used with Elphel.

The first test is made with the cards inside the camera. You can watch the videos on how to assemble/disassemble the camera to connect them, and follow this guide to format them to ext2. Then use telnet or ssh to reach Elphel’s terminal and mount the CF cards. From there, run:

time dd if=/dev/circbuf of=/dev/hda

This command writes the camera’s ~19MB circular buffer (/dev/circbuf) to the card (in this case, /dev/hda). It will also erase all the data on the media – in my case, I even had to format the card again. The result will be something like:

[root@Elphel353 /mnt/0]944# time dd if=/dev/circbuf of=/dev/hdb1
38656+0 records in
38656+0 records out
real 0m 6.75s
user 0m 0.19s
sys 0m 2.46s

It means my Kingston CF cards are taking 6.75 seconds to write the 19MB. Since 19 MB / 6.75 s = 2.8 MB/s, this should be the data rate my cards can handle. In other words, they are operating in PIO mode, not in the DMA mode they were supposed to support.
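The arithmetic behind that figure: dd reports its transfers in 512-byte records, so the 38656 records above amount to roughly 19 MiB written in 6.75 seconds:

```shell
# 38656 records of 512 bytes each, written in 6.75 seconds
awk 'BEGIN {
    mib = 38656 * 512 / 1048576;
    printf "%.1f MiB in 6.75 s = %.1f MiB/s\n", mib, mib / 6.75;
}'
# prints: 18.9 MiB in 6.75 s = 2.8 MiB/s
```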

Another command used to test the reading speed of a medium is “hdparm -T -t”. This gives me a somewhat different output:

[root@Elphel353 /mnt/1]865# hdparm -T -t /dev/hda
Timing buffer-cache reads: 46 MB in 0.51 seconds = 91402 kB/s
Timing buffered disk reads: 23 MB in 3.05 seconds = 7708 kB/s

[root@Elphel353 /mnt/1]865# hdparm /dev/hda
multcount = 0 (off)
IO_support = 0 (default 16-bit)
unmaskirq = 0 (off)
using_dma = 0 (off)
keepsettings = 0 (off)
readonly = 0 (off)
readahead = 256 (on)
geometry = 8060/16/63, sectors = 8124480, start = 0

As we can see, hdparm tells me my cards can read at nearly 8 MB/s and that the “using_dma” option is disabled. I tried turning it on, but the result didn’t change:

[root@Elphel353 /mnt/1]865# hdparm -d1 -X mdma2 /dev/hda
setting using_dma to 1 (on)
setting xfermode to 34
(multiword DMA mode2)
using_dma = 1 (on)

[root@Elphel353 /mnt/1]865# hdparm /dev/hda
multcount = 0 (off)
IO_support = 0 (default 16-bit)
unmaskirq = 0 (off)
using_dma = 1 (on)
keepsettings = 0 (off)
readonly = 0 (off)
readahead = 256 (on)
geometry = 8060/16/63, sectors = 8124480, start = 0

[root@Elphel353 /mnt/1]865# hdparm -T -t /dev/hda
Timing buffer-cache reads: 48 MB in 0.52 seconds = 92824 kB/s
Timing buffered disk reads: 24 MB in 3.05 seconds = 8036 kB/s

Since the outputs conflict, I removed the cards from the Elphel to check them on my desktop and laptop. They give “bad/missing sense data” errors but, curiously, register the same speed as the first test, 2.7 MB/s.

root@laika:/home/livre# hdparm /dev/sde
SG_IO: bad/missing sense data, sb[]: f0 02 05 00 00 00 00 0a 00 aa 55
42 20 00 02 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00
multcount = 0 (off)
readonly = 0 (off)
readahead = 256 (on)
geometry = 8060/16/63, sectors = 8124480, start = 0

root@laika:/home/livre# hdparm -Tt /dev/sde
Timing cached reads: 12352 MB in 2.00 seconds = 6180.82 MB/sec
Timing buffered disk reads: 10 MB in 3.68 seconds = 2.72 MB/sec

The next tests (below) suggest my CF cards fall behind the required writing speed once the data rate reaches 7.1~7.6 MB/s, causing ‘buffer overruns’. They also show Elphel uses more CPU when writing to them than to other media. Overall, we now know the limits of the cards, and they are not worth it if an external HDD is at hand.

Now, let’s do some more tests in which we consider the media we’ll be recording with and the issue of having or not having a live streaming.

Recording with bugged CF Cards

Even though the Kingston cards cannot reach their rated UDMA recording speed of 20 MB/s, I still want to know how far they can take me.

So the first test determines how much CPU it takes to stream the video live to a computer with GStreamer, for monitoring. The idea is to have an isolated measurement for future reference.

Streaming statistics (no recording, just streaming):
2064×896 @ 23.98 fps JP46
JPG Quality          Frame size (K)    CPU        Stream (MiB/s)
100         1041      84-89%                 8,0-8,5
99           953         75-81%                 7,7
98           793         76-78%                 7,6
97           678         77-82%                 7,3-8,1
96           576         58-66%                 7,0
95           471         55%                       5,9 *
94           420         57-93%                 9,2-10,0 *
93           368         77-84%                 8,7 *
92           318         58-72%                 7,8 *
91           296         61-66%                 4,7-5,8
90           272         41-59%                 4,8-6,6
85           186         34-39%                 4,3-4,5
80           140         30-31%                 3,4
75           110         23-24%                 2,7-3,0

* The probable explanation for this behavior is that Elphel handles the stream fine up to 92%-94% quality, where the load starts to oscillate clearly and then suffer. Beyond that point it probably discards frames to reduce the load – which is why the CPU figure drops again at 95%.

The second test is recording only. All the PHP instances running on Elphel were killed, and recording was done with camogm via the terminal. I used the same environment as before.

Recording statistics (no streaming):
2064×896 @ 23.98 fps JP46
JPG Quality          CPU
95%       52-65% * constant buffer overruns
94%       59-83% * buffer overrun
93%       60-78% * buffer overrun
92%       56-82% * buffer overrun
91%       59-77%
90%       46-65%
85%       37-43%
80%       29-30%
75%       21-23%
50%       5-12%

Well, my CF cards start to choke once a frame exceeds roughly 300k (above 91% JPEG Quality), which corresponds to about 7.2 MB/s. Now we must do both things together.
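That 7.2 MB/s figure is just the frame size multiplied by the frame rate (treating 300k frames as 300 kB and 1000 kB as one MB, as the rest of this section does):

```shell
# 300 kB per frame at 23.98 frames per second
awk 'BEGIN { printf "%.1f MB/s\n", 300 * 23.98 / 1000 }'
# prints: 7.2 MB/s
```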

Recording + streaming statistics
2064×896 @ 23.98 fps JP46
JPG Quality          CPU                        Streaming CPU (idle, before recording)       Streaming CPU (while recording)
90%       28-33%                 59%       57% * buffer overrun
89%       31-40%                 51%       46-55%
88%       28-36%                 49%       43-54%
87%       27-32%                 46%       47-49%
86%       23-32%                 44%       45-51%
85%       19-26%                 40%       50%
80%       13-17%                 30%       42%
75%       10-14%                 23%       35%

When I record and watch the live video at the same time, I run into problems above 89%. In other words, that is the highest quality I can achieve with my buggy CF cards. Without the streaming, it is 91%. It must also be said: latency was horrid even when recording at 75% quality.

Recording with an external SATA drive

By an incredible coincidence, while I was compiling these tables I got hold of a SATA drive from an unused computer. Let’s see how Elphel’s CPU behaves with it, by running the same tests. Since the streaming-only benchmark would be identical, it is skipped here.

Recording statistics (no streaming):
2064×896 @ 23.994 JP46
JPEG Quality        Frame size (K)    CPU
100%     928k      56-67% * constant buffer overruns
99%       834k      49-74% * buffer overrun
98%       664k      59-61%
97%       544k      47-49%
96%       457k      36-39%
95%       379k      31-34%
94%       326k      26-28%
93%       286k      23%
92%       253k      20%
91%       240k      19%
90%       223k      18%
85%       163k      13%
80%       128k      10%
75%       108k      9%
50%       71k         6%

The difference is huge and shows how right Andrey is. Now I can record up to 664k (98% JPEG Quality) per frame, instead of the 300k (91%) I got with the buggy CF cards. That seems a bit more than double, but if you look back at the BPP table, you realize it is actually more than 3× the quality I had before (remember the steep curve). What about recording with streaming?

Recording + streaming statistics
2064×896 @ 23.994 JP46
JPG Quality          Frame size (K)    CPU                        Streaming CPU
97%       553k      60-64%                 61-64% * buffer overrun
96%       467k      51-52%                 53-56%
95%       387k      60-88%                 47-48%
90%       225k      46-50%                 64-66%
85%       163k      36%                       48-51%
80%       128k      27%                       38-41%
75%       108k      23%                       31-33%

That’s it. If I choose to record and have a live stream, the maximum quality I’ll get without getting into trouble is 96%; if I record without the stream, it is 98%. I can only reach 100% if I record at a lower resolution.

It must be said, though, that recording at 467k (96%) is very risky. I have to kill all PHP and autoexposure processes, and any scene that demands more information can result in dropped frames. So I would recommend using lower values than this one.

What about JP4?

The only question that remains is: what if we use JP4 instead of JP46 for recording? This assumes we won’t have a stream, even if we wanted one, because no software can decode JP4 live yet. JP4 is a RAW format that skips some of the camera’s internal processing steps, so we could benefit from just dumping the raw data to the media. Does it really help us? Will we get higher quality?

Recording statistics (no streaming):
2064×896 @ 23.994 JP4
JPEG Quality        Frame size (K)    CPU                        Frame size (K) for JP46 (as reference)
99%       828k      50-75%                 834k * buffer overrun
98%       656k      61-62%                 662k
97%       534k      41-42%                 543k

The answer is a direct no. If we step one degree up, to 99% JPEG Quality, we get a buffer overrun. The difference in frame size at this resolution is not enough to let us go further, even with Elphel’s internal processing steps skipped.

Summing up:

Maximum quality that can be achieved in my setup for 2064×896 @ 23.994 JP4:

Media                                    JPEG Quality with streaming           JPEG Quality without streaming
Buggy CF card:                    89%                                                      91%
External SATA drive:         96%                                                      98%

The problem with streaming: I could simply record the stream on a computer and then edit from that. As we have seen, that gives me something between 92% ~ 94% JPEG Quality. However, there doesn’t seem to be a way yet to convert a stream recorded by GStreamer into DNG sequences (for example, with the movie2dng software). In practice, this means we would lose the RAW advantages.

The higher the resolution, the higher the frame size. So another way of looking at the same issue is to consider how much data you need to transfer, both to stream and to record. In my case, my safe margin seems to be:

Maximum data rate (frame size) that can be achieved for 2064×896 @ 23.994 JP4:
Media                                    Frame size with streaming              Frame size without streaming
Buggy CF card:                    250k ~260k                                        296k ~ 300k
External SATA drive:         below 470k                                       664k ~ 760k

This way I can play with resolution – increase the frame size and decrease the JPEG Quality. As long as I stay close to these values, it should work and I shouldn’t get buffer overruns or dropped frames.

Shutter Speed and Exposure time cheatsheet

In case you are not using ElphelVision for your monitoring and recording, you will need a cheatsheet to set the exposure time (shutter speed), because in Elphel’s interface this number is given in milliseconds. So here it goes:

Exposure time vs Shutter speed in Elphel
Shutter speed   Exposure time (s)   Exposure time (ms)
1 sec     1     1000 ms
1/2     0,5     500 ms
1/4     0,25     250 ms
1/8     0,125     125 ms
1/15     0,066     66 ms
1/24     0,041     41 ms
1/25     0,040     40 ms
1/30     0,033     33 ms
1/60     0,016     16 ms
1/90     0,011     11 ms
1/125     0,008     8 ms
1/250     0,004     4 ms
1/500     0,002     2 ms
1/1000     0,001     1 ms
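The ms column above is simply 1000 divided by the shutter-speed denominator, truncated, so the table can be regenerated (or extended) with a one-liner:

```shell
# Exposure time in milliseconds for each shutter-speed denominator
awk 'BEGIN {
    n = split("2 4 8 15 24 25 30 60 90 125 250 500 1000", d, " ");
    for (i = 1; i <= n; i++) printf "1/%-4d %4d ms\n", d[i], int(1000 / d[i]);
}'
# e.g. 1/24 gives 41 ms and 1/60 gives 16 ms, matching the table
```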

Knowing that:
Doubling the exposure time (one shutter-speed step slower) = +1 Exposure Value (EV);
Opening the aperture by one F-Stop (dividing the F-number by 1.4) = +1 Exposure Value (EV)

1.4     2     2.8     4     5.6     8     11     16     22
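That f-stop sequence is just successive powers of √2; the values marked on lenses (5.6, 11, 22) are conventional roundings of the exact numbers:

```shell
# Each full stop multiplies the F-number by sqrt(2), i.e. ~1.414
awk 'BEGIN { for (n = 1; n <= 9; n++) printf "%.1f\n", 2 ^ (n / 2) }'
# prints 1.4, 2.0, 2.8, 4.0, 5.7, 8.0, 11.3, 16.0, 22.6
```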

Audio recording

The Elphel 353 cameras have a USB 1.0 port on the back. It can be used to record audio alongside the video, in a separate file. Elphel’s wiki has two links describing this process: one covers ALSA in the camera context (the 353 model comes with ALSA installed), and the other shows how JACK can be used with Timemachine for audio recording.

It is actually much easier to use arecord, a small command-line program that is part of the ALSA packages. As seen throughout this article, we have used the command line for recording in order to achieve the highest quality possible (meaning: the camera’s processor is freed from graphical interface demands). So it makes sense to incorporate the audio recording task into this same script.

Now, when running it, the script will ask for a basename for your file. This basename will be placed before the large numeric name Elphel uses as default for recording, making it easy to check which audio file refers to which video file.

Below, you can see the first script, just for recording audio; and the second script, that records both, using the same method we described on the “Recording” section.



#!/bin/sh
# audio_rec.sh - records audio from a USB microphone with arecord.
# MOUNT_POINT below is an assumption: set it to wherever your recording media is mounted.

if [ -z "$1" ]; then
# Sanity check. User must inform the name for the sound file.
echo -e "\e[0;31mYou must inform a name for the sound file.\n\
Re-run the script this way 'sh script.sh name_of_the_file'.\e[00m" && exit;
fi

MOUNT_POINT="/var/hdd"; # assumed mount point of the recording media
FILE_NAME="$1";

IDENTIFIER="USB Audio"; # change to your Microphone's identifier
# Parse arecord's device listing to find the card and device numbers.
AUDIO_CARD=`/bin/arecord -l | grep "$IDENTIFIER" | grep -o "[0-9]" | head -1`;
AUDIO_DEVICE=`/bin/arecord -l | grep "$IDENTIFIER" | grep -o "[0-9]" | tail -1`;

# 16-bit little-endian at 48 kHz, with a mono volume meter on the terminal.
/bin/arecord -f S16_LE -r 48000 --vumeter=mono -D hw:$AUDIO_CARD,$AUDIO_DEVICE $MOUNT_POINT/$FILE_NAME.wav




#!/bin/sh
# Records video (camogm) and audio (audio_rec.sh) together.
echo "Started recording..."

MOUNT_POINT="/var/hdd"; # assumption: set to wherever your recording media is mounted

killall autoexposure # free the CPU from the autoexposure daemon
sh audio_rec.sh $1 &  # start audio recording in the background

# Tell camogm, through its command pipe, to start recording with our prefix.
echo "status; exif=1; format=mov; duration=60000; length=100000000000; prefix=$MOUNT_POINT/$1; start; status=/var/tmp/camogm.status" > /var/state/camogm_cmd &

During the recording session, arecord displays a volume meter on the terminal. During editing, you will have to sync the files (dailies) using their names and the clapperboard as reference. The following screencast shows how the video above was edited. It was shot in RAW JP46 and had to be debayered. In Cinelerra, I used my clapping hands as visual and sound keys for syncing. Then it was ready to be watched.



Written by qazav_szaszak

June 4, 2012 at 13:40
