Raspberry Pi-powered webcam

With the pandemic continuing to force many people, including myself, to work from home, effective communication tools have become paramount for day-to-day teamwork. The surge in video chat in particular has left popular webcam models sold out for weeks or months at a time. While I was lucky to get my hands on a Logitech C922 pretty early on, I wanted to experiment with other potential solutions, especially ones that let me control zoom and focus more precisely.

Image courtesy of the Raspberry Pi Foundation

This led me to the Raspberry Pi High Quality Camera. It’s a camera module, similar to the other Pi cameras that have been available for years now, but with a 12.3MP Sony sensor and support for C- and CS-mount compatible lenses. It also features a standard 1/4″-20 UNC tripod mount.

Hardware and software

I’ve got quite a few Raspberry Pi computers lying around, from the very first release (with a rather fragile full-size SD card slot) to the latest Raspberry Pi 4 varieties, so finding one to use wasn’t an issue for me. I purchased an HQ Camera module and a few random C- and CS-mount lenses. Unfortunately, the fixed-focal-length prime lenses that I initially got didn’t work well for me, as their optimal subject distance didn’t fit my setup. I eventually bought a zoom lens with manual focus, which has done its job pretty well so far.

Next, I had to figure out a good way to get the camera stream into my desktop computer. The raspivid application that comes with Raspberry Pi OS has the ability to stream video data over a network directly or via a pipe to another app. My first attempt involved using netcat to stream to my Windows PC, receiving the stream with ncat, and piping the data into MPlayer. It mostly worked, but the stream was laggy and had occasional hiccups, although those were probably caused by the Pi’s wireless connection. This didn’t seem like a good solution. Even if a more stable network connection solved both the latency and hiccup issues, there was still the problem of getting MPlayer (or something else) to act as a webcam. The closest I came to getting that to work was using OBS to capture the MPlayer window and export that stream as a virtual webcam, but this was rapidly becoming an extremely hacky solution. There had to be a better alternative.
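The rough shape of that experiment, with placeholder addresses and flags rather than the exact commands, was something like this:

# On the PC: listen for the stream and pipe the raw H.264 data into MPlayer
ncat -l 5000 | mplayer -fps 30 -cache 1024 -demuxer h264es -

# On the Pi: send raspivid's output over the network to the PC
raspivid -t 0 -w 1920 -h 1080 -fps 30 -o - | nc 192.168.1.50 5000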

A capture card on eBay

Enter: USB capture cards! An entire industry has sprung up around video game live streaming, play-throughs, speedruns, and similar forms of entertainment showing gameplay. Capture cards are used to perform the actual capturing and relaying of video and audio data from game consoles, as well as other sources, to PCs for streaming or recording. (The term “capture card” is a little misleading today. Historically referring to peripherals connected to motherboard expansion slots, its definition has now grown to include external USB peripherals as well.)

The specific capture card that I purchased happens to be widely available under various names on eBay and Amazon. This particular model’s feature listing is a little misleading, as it suggests that the device can record 4K streams. Well, it can capture 4K streams, and it can pass them through to the output HDMI port, but it can only output a maximum resolution of 1080p over USB. Still, that’s pretty good functionality for the price, and I don’t need 4K for what I’m doing anyway. As you might have guessed, I’m connecting the Raspberry Pi to the capture card and using that as a webcam. In fact, with this model anyway, the setup was a lot simpler than I anticipated. Instead of showing up in Device Manager as a custom USB peripheral requiring weird third-party drivers, it appears as a webcam! The only thing left to do is configure the Raspberry Pi to output the camera’s video stream to the main screen.

I chose a simple Pi configuration, at least for now. I might write something more advanced that allows live tweaking of camera settings later on, but at the moment, built-in applications are sufficient. I used Raspberry Pi OS Lite for speed. Since the camera preview can be rendered directly onto the Linux framebuffer, there was no need to install a GUI at all. A simple script was used to launch the app:

#!/bin/bash
TERM=xterm clear > /dev/tty1
raspivid -t 0 -drc med -sa 25

The first command, after the shebang line, clears the main screen’s terminal, which may still have output from the boot sequence. The second line launches raspivid, which renders the camera preview directly to the display: -t 0 keeps it running indefinitely, -drc med enables medium dynamic range compression, and -sa 25 bumps the saturation slightly.

To get this script to run at startup, I wrote a basic systemd service:

[Unit]
Description=Pi Camera Service

[Service]
Type=simple
ExecStart=/bin/bash /root/service-script.sh

[Install]
WantedBy=multi-user.target

To install the systemd service, I did the following:

ln -s /root/camera.service /etc/systemd/system/
systemctl daemon-reload
systemctl enable camera

The enable command makes the service run automatically at startup. To launch it immediately, execute systemctl start camera.
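
If the camera feed doesn’t appear after a reboot, the standard systemd tools will show whether the script ran and what it printed:

systemctl status camera
journalctl -u camera -b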

Finally, a couple of tweaks had to be made to the boot config and kernel command-line files. In /boot/config.txt, the HDMI output was forced to 1080p (CEA mode 16, i.e. 1080p at 60 Hz):

hdmi_group=1
hdmi_mode=16

And /boot/cmdline.txt was modified to hide extra text output, the Raspberry Pi logos, and the blinking cursor:

console=serial0,115200 console=tty3 root=PARTUUID=b48c1aa6-02 rootfstype=ext4 elevator=deadline fsck.repair=yes rootwait logo.nologo vt.global_cursor_default=0
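
As a quick sanity check, the tvservice utility that ships with Raspberry Pi OS (on the legacy display stack that raspivid uses) can report which HDMI mode was actually negotiated:

tvservice -s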

Mounting

To get everything situated on my monitor, I decided to design and 3D print my own mount:

// Requires Dan Kirshner's threads.scad for english_thread()
use <threads.scad>

// large vertical piece with thread
difference() {
    linear_extrude(height = 74)
        square([80, 4]);

    translate([40, -3, 4])
        rotate([270, 0, 0])
            english_thread(diameter=1/4, threads_per_inch=20, length=1, internal=true);

    rpi_holes(5, 14);
}

// rear horizontal piece with thread
difference() {
    translate([0, 0, 70])
        linear_extrude(height = 4)
            square([80, 60]);

    translate([40, 38, 60])
        english_thread(diameter=1/4, threads_per_inch=20, length=1, internal=true);
}

// front horizontal piece
translate([0, -35, 70])
    linear_extrude(height = 4)
        square([80, 35]);

// front overhang piece
translate([0, -35, 64])
    linear_extrude(height = 10)
        square([80, 3]);

// clearance hole for an M2.5 screw, cut through the vertical piece
module hole(x, y) {
    color("red")
        rotate([90, 0, 0])
            translate([x, y, 0])
                cylinder(d=2.7, h=15, $fn=100, center=true);
}

// WARNING: these may be inaccurate!
module rpi_holes(x, y) {
    hole(x, y);
    hole(x + 58, y);
    hole(x, y + 49);
    hole(x + 58, y + 49);
}
OpenSCAD was used to design the mount

I used OpenSCAD to model the mount and then export the model as a 3MF file for slicing with Simplify3D. The english_thread function comes from Dan Kirshner’s threads.scad library. In addition to the tripod mount holes, I printed nuts and bolts to go along with them. Creating the bolt was straightforward:

cylinder(r=8, h=6, $fn=6);  // hexagonal head (a 6-sided cylinder)
english_thread(diameter=1/4, threads_per_inch=20, length=1.5);  // 1.5 in of 1/4"-20 external thread

Creating a nut was just as straightforward:

difference() {
    cylinder(r=8, h=6, $fn=6);  // hexagonal body
    translate([0, 0, -0.1])     // start slightly below the bottom face so the cut goes all the way through
        english_thread(diameter=1/4, threads_per_inch=20, length=0.5, internal=true);
}
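
The export step doesn’t have to go through the GUI, either; a reasonably recent OpenSCAD build can write the 3MF directly from the command line (the file names here are just illustrative):

openscad -o camera-mount.3mf camera-mount.scad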

The printed result didn’t turn out exactly as I imagined, but it got the job done:

Two mounts in use

The 1/4″-20 threading on the mount and on the printed nuts and bolts worked surprisingly well, both for mounting the HQ Camera (the top bolt) and for changing the camera’s pitch (the bottom bolt). Since everything is made of PLA, there is little concern about damaging the monitor or the camera module, and the plastic is just malleable enough that I was able to use an existing tripod with metal threading as a makeshift thread chaser to make sure the printed threads came out right.

Due to the structure of my monitor, I was unable to use a single mount for both the camera and the Raspberry Pi: the HDMI connection location made that impossible — see the awkward angle of the Pi in the above photo. Also, I realized that I had no M2.5 screws, so I just used some twist ties to secure the Raspberry Pi to the second mount. That had the nice side effect of letting me avoid dealing with a potentially inaccurate calculation for the locations of the Raspberry Pi mount holes. (The mechanical drawing clearly indicates the distances between the centers of the mount holes, which is the information I used, but, after printing, the holes appear to be just slightly misaligned. I’m uncertain as to the cause.)

Overall, this was a fun project! Between designing and printing my own monitor mount and learning something about camera lenses (I still have a lot to learn in that area), I’ve got new experiences and useful knowledge for future projects. If you have any questions or suggestions, hit me up on Twitter.

P.S.: Possible future enhancements

A few things I might do later on to make all of this work better:

  • Replace the shell script for launching raspivid with a Python script utilizing the picamera package, so that I can adjust camera parameters in real time, possibly via a simple web interface
  • Fix the mounting holes in an updated design and use a single mount once my new monitors arrive (their design does not have the protrusion seen in the above photo)
  • Connect my microphone to the Raspberry Pi so that audio and video would both come from the same source