Capturing with Arducam MIPI monochrome global shutter cameras

Wed Apr 01, 2020 4:35 pm


Recently I captured video with the v1 front camera of raspcatbot driving backward at 1.14m/s along a tape line. The scene was lit with a 5000lm LED from outside, shutter speed was 100µs:
https://www.raspberrypi.org/forums/view ... 0#p1633505

Even though the shutter time was really short, the frames are still shaky (right: with the center camera):



So I wanted to mount one of the monochrome global shutter cameras onto the robot and see the difference compared to the rolling shutter camera. Because I later want to do live frame processing for autonomous robot driving, I was interested in a low number of pixels. So I decided to go with the smallest and cheapest ($25.99) ov7251 Arducam MIPI camera (the monochrome version of the ov7750 color sensor):
https://www.uctronics.com/arducam-ov725 ... amera.html

There is a raw_callback.c demo in the Arducam MIPI_Camera repo that just counted the frames recorded.
I enhanced that demo a bit, and just committed and pushed to my fork:
https://github.com/Hermann-SW/MIPI_Came ... fff641aa2a

First, "raw_callback" now creates a "frame.pts" (microsecond precision) timestamp file in the current directory, in the same format as the raspivid "-pts" option writes.

Then, similar to the raspiraw high framerate work, I utilized "/dev/shm" to optionally write out N raw frames as "/dev/shm/frame.%04d.raw10".

arducamstill demo output reveals that the stored format is Y10P:

Code: Select all

...
Current mode: 0, width: 640, height: 480, pixelformat: Y10P, desc: (null)
...
https://www.kernel.org/doc/html/v4.19/m ... -y10p.html

This is a packed grey-scale image format with a depth of 10 bits per pixel. Every four consecutive pixels are packed into 5 bytes. Each of the first 4 bytes contains the 8 high-order bits of one pixel, and the 5th byte contains the 2 least significant bits of each pixel, in the same order.
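Under this packing, a 640x480 frame takes 640*480/4*5 = 384000 bytes. A minimal unpacking sketch for one 5-byte group, with the bit layout taken from the kernel doc linked above (this is an illustration, not code from y10ptopgm.c):

```c
#include <stdint.h>

/* Unpack one 5-byte Y10P group into four 10-bit pixels: bytes 0-3 hold
   the 8 high-order bits of pixels 0-3, and byte 4 holds the 2 low-order
   bits of each pixel (pixel 0 in bits 1:0, pixel 3 in bits 7:6). */
void y10p_unpack4(const uint8_t in[5], uint16_t out[4])
{
    for (int i = 0; i < 4; i++)
        out[i] = (uint16_t)((in[i] << 2) | ((in[4] >> (2 * i)) & 0x3));
}
```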

My Pi3A+ has 447756KB free space in /dev/shm, which is sufficient to store 1190 640x480 frames (384000 bytes each).

I created a new netpbm-like tool, y10ptopgm.c, which converts a (640x480 only as of now) Y10P frame to a portable gray map:
https://github.com/Hermann-SW/MIPI_Came ... 10ptopgm.c

Calling "raw_callback 500 1000" captures 640x480 frames and stores the first 1000 frames with exposure 500 in /dev/shm. Timestamps of all frames captured in 10s get stored in frame.pts. With exposure 500 and below, the ov7251 captures 640x480 at a 198fps framerate, faster than the ov7251 datasheet says:
https://cdn.datasheetspdf.com/pdf-down/ ... Vision.pdf

Capturing with exposure 1000 results in a 100fps framerate, and experimenting shows that 620 is the lowest exposure with zero frame skips in a 10s recording (161fps framerate). I used the ptsanalyze tool for frame skip and delta analysis:
https://github.com/Hermann-SW/userland/ ... ptsanalyze
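For illustration, the core of such a skip analysis can be sketched like this (a rough approximation of what ptsanalyze reports, not the actual tool; the 1.5x-frame-time threshold is my assumption):

```c
#include <stddef.h>

/* Count frame skips in a frame.pts-style microsecond timestamp list:
   a delta of at least 1.5 nominal frame times means at least one
   frame was dropped between the two timestamps. */
int count_frame_skips(const long *pts_us, size_t n, long frame_us)
{
    int skips = 0;
    for (size_t i = 1; i < n; i++)
        if (pts_us[i] - pts_us[i - 1] > frame_us + frame_us / 2)
            skips++;
    return skips;
}
```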

This is output of raw_callback execution:

Code: Select all

pi@raspberrypi3Bplus:~/MIPI_Camera/RPI $ ./raw_callback 500 1000
Open camera...
Hardware Platform: a020d3
Found sensor ov7251 at address 60
Setting the resolution...
Can't open the file
mmal: Failed to fix lens shading, use the default mode!
Current resolution is 640x480
Notice:You can use the list_format sample program to see the resolution and control supported by the camera.
Read 0x3662 value = 0x01
Start raw data callback...
Total frame count = 1850
TimeElapsed = 10.000090
Stop raw data callback...
Close camera...
pi@raspberrypi3Bplus:~/MIPI_Camera/RPI $ 

Here you can see the analysis of the generated "frame.pts" timestamp file: 6% frame skips, no frame delta bigger than 10.090ms:

Code: Select all

pi@raspberrypi3Bplus:~/MIPI_Camera/RPI $ time ( ~/ptsanalyze frame.pts 0 | grep -v "^>" )
creating tstamps.csv
1850 frames were captured at 198fps
frame delta time[us] distribution
      1 5038
      2 5039
      3 5040
      3 5041
     12 5042
     85 5043
    666 5044
    804 5045
    108 5046
     21 5047
      4 5048
      2 5049
      3 5050
      1 5051
      1 5052
      1 10083
      1 10084
      1 10085
      1 10086
      7 10087
     33 10088
     81 10089
      6 10090
after skip frame indices (middle column)
131 frame skips (6%)

real	0m16.868s
user	0m12.426s
sys	0m8.294s
pi@raspberrypi3Bplus:~/MIPI_Camera/RPI $ 

Last night I recorded frames with raw_callback while moving a finger fast in front of the camera. The scene was lit from 1m above with a 1000lm lamp. Lux (lumen/area) decreases quadratically with distance. I just bought 10W 1000lm LEDs because of that; they will be mounted on the raspcatbot front to light the scene and get brighter frames ("bring your own light"). This is an animation I created from frames 0900-0999 of the capture, recorded at a 198fps framerate (1861 frames in 10 seconds), played at 20fps, roughly 10× slower than real time (frames will get brighter with the robot-mounted 1000lm LEDs):
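The inverse-square falloff can be sketched with an idealized isotropic point source (a real lamp with a reflector concentrates more light downward, so this is only a lower bound):

```c
/* Illuminance of an idealized isotropic point source:
   E = flux / (4*pi*r^2), in lux when flux is in lumen and r in meters.
   Doubling the distance quarters the illuminance. */
double lux_at(double lumen, double r_m)
{
    const double pi = 3.14159265358979323846;
    return lumen / (4.0 * pi * r_m * r_m);
}
```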



Here are the two simple tools I used to create the animation from the 100 pgm files.

"togg" converts the pgm frames to png format using a netpbm tool, and then creates an ogg video with gstreamer:

Code: Select all

🍓 cat togg 
#!/bin/bash

for f in frame.09*pgm; do pnmtopng $f > $f.png; echo $f; done

echo "now creating .ogg"
gst-launch-1.0 multifilesrc location="frame.%04d.raw10.pgm.png" index=900 caps="image/png,framerate=\(fraction\)20/1" ! pngdec ! videorate ! videoconvert ! videorate ! theoraenc ! oggmux ! filesink location="x.ogg"
🍓 

Code: Select all

🍓 cat pngs2anim 
#!/bin/bash

echo "now creating .anim.gif"
# doc:   http://blog.pkh.me/p/21-high-quality-gif-with-ffmpeg.html
# needs: ffmpeg

palette="/tmp/palette.png"

#filters="fps=15,scale=320:-1:flags=lanczos"
filters="fps=20,scale=640:-1:flags=lanczos"

ffmpeg -v warning -i x.ogg -vf "$filters,palettegen" -y $palette
ffmpeg -v warning -i x.ogg -i $palette -lavfi "$filters [x]; [x][1:v] paletteuse" -y x.anim.gif
🍓 

P.S.:
If you look at my commit diff, you will see a new section commented out, with a link into the datasheet.
https://github.com/Hermann-SW/MIPI_Came ... fff641aa2a
Register 0x3662 allows choosing raw8 format instead of raw10.
If that works, 5/4×1190 = 1487 frames would fit into /dev/shm.
There would be no loss in image quality, since y10ptopgm.c just discards the lowest two bits per pixel anyway.
I will follow up with Lee from Arducam on that.

I was surprised that the values written to the exposure register seem to correspond to 10 microseconds each.
The values are multiples of t_row, the time it takes to transfer a frame row.
Unfortunately the datasheet mentions t_row in several places, but without any numbers.
With exposure 500 I saw a 198fps framerate, so 10.52µs for t_row:
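My back-of-the-envelope computation, assuming the whole frame period is spent transferring the 480 rows of the frame:

```c
/* Estimated row transfer time: at 198fps the frame period is
   1e6/198 = 5050.5us; dividing by the 480 rows of a 640x480 frame
   gives t_row = 10.52us per row. */
double t_row_us(double fps, int rows)
{
    return 1e6 / fps / rows;
}
```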
P.P.S.:
Just realized that I will need an exposure of 20 for 200µs shutter time, so bright light is needed.

