Sakura Time-Lapse Camera

March 27, 2018

It’s cherry blossom season in Japan, and everyone loses their collective minds over it. There are official forecasts of when the trees will bloom. There are different words to describe the progression of the flowers. Admirers flock to tree-lined parks for picnics during the day, and then return again at night for lantern-lit strolls. There are special sweets and even seasonal beer cans.

As I was drifting to sleep one night, I thought about how lovely it would be to watch the bloom arrive and recede in time-lapse.

Cherry blossoms in Yoichi, Japan

I remembered I had brought a cheap GoPro knock-off, an APEMAN-brand action cam, so on a lark I tested to see if it could act as a USB webcam—and indeed it can. If you start it up attached to a computer, it asks whether it should act as a video camera or mass storage device.

The SakuraCam began to take shape. I headed to the ¥100 store with the basic parts—the camera, a Raspberry Pi, and a 16,000 mAh battery pack—and played around with arranging everything in variously sized plastic organizer boxes, imagining how the cables would be dressed and assessing them for weather resistance. I settled on a shallow toolbox-style one with a handle and toggle latch.

I made a coarse cut to allow the camera lens to stick through the case, then sealed up the gaps with hot glue. (In retrospect, the mirror-image arrangement would have avoided some problems.) The camera’s USB connection doubled as its power supply, and the Raspberry Pi was in turn powered by the battery. A short script invoked fswebcam to capture a frame from the webcam at regular intervals, and purged the oldest frames when the SD card filled up. At one frame per minute, sped up to 24 fps, I had enough space to store about 7 minutes of final footage (a 1,440× speedup, so about a week of real time).
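The script was only a few lines; here is a minimal sketch of the idea in Python (the path, resolution, and free-space threshold are placeholders, not the original values):

```python
import shutil
import subprocess
import time
from pathlib import Path

CAPTURE_DIR = Path("/home/pi/frames")  # hypothetical location on the SD card
INTERVAL = 60                          # seconds: one frame per minute
MIN_FREE = 100 * 1024 * 1024           # purge when less than ~100 MB remains (assumed)

def oldest_first(directory: Path) -> list:
    """All captured frames, oldest first (timestamped names sort chronologically)."""
    return sorted(directory.glob("*.jpg"))

def purge(directory: Path, min_free: int) -> int:
    """Delete the oldest frames until at least min_free bytes are available."""
    deleted = 0
    frames = oldest_first(directory)
    while frames and shutil.disk_usage(directory).free < min_free:
        frames.pop(0).unlink()
        deleted += 1
    return deleted

def capture(directory: Path) -> None:
    """Grab a single frame from the webcam with fswebcam."""
    name = directory / time.strftime("%Y%m%d-%H%M%S.jpg")
    subprocess.run(["fswebcam", "-r", "1920x1080", "--no-banner", str(name)], check=True)

def run() -> None:
    """Capture forever; call this to start the loop."""
    CAPTURE_DIR.mkdir(parents=True, exist_ok=True)
    while True:
        purge(CAPTURE_DIR, MIN_FREE)
        capture(CAPTURE_DIR)
        time.sleep(INTERVAL)
```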

SakuraCam Mark I

Everything seemed to have fallen into place until, after a few minutes of testing at 5 fps, the camera reset. Then it reset again after another few minutes. Worse, upon reset it returns to the mass storage / video camera prompt, which requires physical interaction—a hard failure in the field.

With no other webcam to scrounge up, the project seemed unworkable. I thumbed through the camera’s settings, which include a time-lapse mode, but the interval can’t be set any longer than a few seconds. Then I noticed the Wi-Fi settings, and wondered whether I could use it as an IP camera.

In Wi-Fi mode, the camera creates a wireless network which you join from your smartphone, and then you can control the camera via an app. It’s not entirely clear whether there is an official app for this, but CamKing seemed to be the closest thing, and while it is not the most well-crafted app in existence, it works. It allows remote configuration of some of the camera’s settings, such as the exposure value, and best of all, it can capture still frames at 5K resolution, far exceeding the 1080p I could get from the camera as a USB video device.

The only challenge now was figuring out the protocol for triggering a photo.


I pulled the Android APK for CamKing to decompile it, and found that it talks to a web server at 192.168.1.254 that serves a browsable directory index of the SD card, as well as a video stream on port 8192. Taking photos, changing settings, and so on are done by making a GET request with a corresponding command number:

# Set the clock
$ curl 'http://192.168.1.254/?custom=1&cmd=3006&str=15:43:22'
<Function>
  <Cmd>3006</Cmd>
  <Status>0</Status>
  <Value>0</Value>
</Function>
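The whole request pattern can be sketched in a few lines of Python. The `par` parameter for integer arguments, and 1001 as the capture command, are my reading of other Novatek-based cameras and may differ on a given firmware:

```python
import urllib.parse
import urllib.request
import xml.etree.ElementTree as ET

CAMERA = "http://192.168.1.254"

def build_url(cmd: int, par: int = None, string: str = None, base: str = CAMERA) -> str:
    """Assemble the GET request for one command; every command shares this shape."""
    params = {"custom": 1, "cmd": cmd}
    if par is not None:
        params["par"] = par       # integer argument (assumed parameter name)
    if string is not None:
        params["str"] = string    # string argument, e.g. a clock value
    return base + "/?" + urllib.parse.urlencode(params, safe=":")

def parse_reply(xml_bytes: bytes) -> dict:
    """Flatten the camera's <Function> reply into a plain dict of tag -> text."""
    return {child.tag: child.text for child in ET.fromstring(xml_bytes)}

def send_command(cmd: int, par: int = None, string: str = None) -> dict:
    """Send one command to the camera and return its parsed reply."""
    with urllib.request.urlopen(build_url(cmd, par, string), timeout=5) as resp:
        return parse_reply(resp.read())

# e.g. send_command(1001) to (probably) take a photo,
#      send_command(3006, string="15:43:22") to set the clock as above
```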

Amazingly, the web server appears to be HFS, an open-source web server for Windows. I was originally led to HFS by the HTTP headers, but dismissed it because it’s a GUI app, and, well, for Windows. Even when the API was returning paths starting with A:\, I chalked it up to some confused developer. Then it dawned on me that HFS is running in Wine! Surely this was the most practical solution.

Another trick I learned: rvictl -s [udid] on macOS will create an rvi# interface that taps the network connection of an iOS device (I wasn’t able to inject any packets, but I tried). This was a handy way to sniff the unencrypted traffic between CamKing and the camera as I mapped out the command numbers.

Wireshark sniffing iOS traffic

The APEMAN uses a digital camera SoC from Novatek, a fact that is not well hidden: photos on the SD card are stored in a directory called NOVATEK/, and the USB vendor ID belongs to them. I suspect the SoC is a clone of the Ambarella sports camera SoC once used in the GoPro, and it has found its way into most of the sub-$100 action cams and dashboard cameras from unheard-of brands like Campark and Crosstour. Steven Hiscocks’s web interface for the YI Dash Cam, for instance, uses some of the same command numbers and so likely works with these other devices.

My Python module for communicating with the API is published on GitHub, although the code is very much a rough draft.


Porting the time-lapse script over to the new API was painless. However, since the Raspberry Pi now needs to be on the camera’s Wi-Fi network, I lose SSH access to monitor its status. (The camera’s AP does not support multiple clients.) I made two improvements to help:

First, I was able to wrest control over the green activity LED, a small feat on the Raspberry Pi 3 Model B, to blink out a status report after each capture.
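Driving the ACT LED from userspace looks roughly like this; the status codes are invented for illustration, and writing to /sys/class/leds requires root:

```python
import time
from pathlib import Path

LED = Path("/sys/class/leds/led0")  # the green ACT LED on a Raspberry Pi 3 Model B

# Hypothetical status codes: the number of short flashes after each capture.
BLINK_CODES = {"ok": 1, "capture-failed": 2, "camera-unreachable": 3}

def blink_count(status: str) -> int:
    """Map a status to a flash count; unknown statuses get a conspicuous 5."""
    return BLINK_CODES.get(status, 5)

def report(status: str) -> None:
    """Flash the ACT LED blink_count(status) times (requires root)."""
    (LED / "trigger").write_text("none")  # detach the default activity trigger
    for _ in range(blink_count(status)):
        (LED / "brightness").write_text("1")
        time.sleep(0.2)
        (LED / "brightness").write_text("0")
        time.sleep(0.3)
```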

Second, I configured the device to automatically join the camera network when it is broadcasting, and rejoin the home network when it goes away. This way I can easily gain debug access simply by powering down the camera.
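The network fallback is plain wpa_supplicant configuration: two network blocks with different priorities, so the Pi prefers the camera’s AP whenever it is broadcasting and drifts back home otherwise. A sketch, with placeholder SSIDs and passphrases:

```
# /etc/wpa_supplicant/wpa_supplicant.conf (SSIDs and PSKs are placeholders)
network={
    ssid="APEMAN-CAM"        # the camera's access point
    psk="1234567890"
    priority=10              # preferred whenever it is visible
}
network={
    ssid="home-network"      # fallback, for SSH access
    psk="correct horse battery staple"
    priority=1
}
```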

I did not succeed at powering the camera off of the Raspberry Pi without triggering the USB mode selection menu. (It might be useful to know how in the future, but de-authorizing the device using udev wasn’t enough.) But since the communication is now wireless, I was able to simply move the Raspberry Pi indoors and power the camera directly from the battery. This also pushed the battery life over 24 hours.


So, where’s the video? Ultimately, I didn’t capture the footage I had hoped for, and I stopped short of investigating histogram matching and tone mapping to improve the image quality. Hopefully I’ll be able to use what I’ve learned on another project.