2024-11-13 Fully functional again. The tail motor gear has been replaced, but I'm down to my last set of pinions.
What even?
So, you've just finished hacking up an old-time Big Mouth Billy Bass. Maybe you made it sing Bohemian Rhapsody or converted it into an Alexa. Whatever the case, I'm guessing that - like the original product - the novelty didn't last too long. I find that this is the kind of thing that's way more fun to build than to actually do anything with, and the finished billy bass now sits on the completed-project shelf, where you occasionally notice it staring at you with its cold plastic eyes.
With this undesirable outcome, I decided to give mine its third lease on life, using the method of last resort: connect it directly to the internet and see what people do with it (much like my thermal printer).
Big Mouth Billy Bass is a registered trademark of Gemmy Industries, which has no affiliation with this project.
OK, but what does it do?
You load an MP3, sync up some head, tail, and mouth animations, and upload it to the fish. Then (a bit later), you get a video of it performing. These aren't renderings or CGI/AI; they're real recordings of a real billy bass I've got set up. Each performance is filmed separately.
How?
When you're ready to record it on the fish, you submit it to the queue.
Processing time is usually about 2x the length of the MP3. If you change the background color, that increases to about 3x.
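In code form, the rule of thumb works out like this (a back-of-the-envelope illustration only; actual wait depends on the queue and the fish):

```python
def estimate_processing_seconds(mp3_seconds: float, custom_background: bool = False) -> float:
    """Rough processing-time estimate: ~2x the MP3 length,
    or ~3x if the background color is changed."""
    multiplier = 3.0 if custom_background else 2.0
    return mp3_seconds * multiplier

# A 3-minute song:
estimate_processing_seconds(180)        # -> 360.0 (about 6 minutes)
estimate_processing_seconds(180, True)  # -> 540.0 (about 9 minutes)
```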
I'm not reading all this, show me a video instead
How does it work?
I first tried to animate my fish manually in Audacity, on another audio track, by drawing in another waveform and trying my best to line it up with the words in the main audio track. Then I realized it would be a lot easier to just hook the mouth motor to a button I could press with my fingers in time with the music. I cobbled together this interface to basically be a digital version of that. Since I did it in a browser, I would then be able to put it on the internet after getting bored with the original project.
Actually, about 80% of what I needed was handled by this library I found called Wavesurfer. They even had a handy demo (markers) which implemented the above/below indicators I wanted. Of course, they then updated, changed the UI, and removed the Markers plugin (that's software for ya), so I use an older version.
If you wish to use the shoddy code of this animation interface for something else, be my guest.
The backend is a horrifying pile of Python and a little PHP. This server takes care of hosting this interface, accepting uploads, and hosting the finished videos, but most of the work happens on a remote Raspberry Pi 4, which controls the fish.
On the Pi, I use a ramdisk for all the video work, so I don't run into IO limitations on the SD card or worry about wearing it out. Add this line to /etc/fstab:
tmpfs /ramdisk tmpfs nodev,nosuid,size=2048M 0 0
Recordings are made with a Pi Camera v2. Recording from it is a bit different than just hitting a button: first you record frames and timestamps, then assemble what you collected into an MKV container, at which point you have a normal H.264 video that you can downscale, chroma key, mix audio into, or whatever.
$ rpicam-vid --level 4.2 --framerate 41 --width 1640 --height 1232 --rotation 180 --save-pts /ramdisk/video.pts -o /ramdisk/video.264 --denoise cdn_off -t {record_duration_in_milliseconds} -n
$ mkvmerge -o /ramdisk/video.mkv --timecodes 0:/ramdisk/video.pts /ramdisk/video.264
Recordings are cropped and downscaled to 700x400 to save bandwidth. You probably don't need it in full HD anyway.
I also use ffmpeg to do the chroma keying. From ChatGPT and trial and error, this complicated command replaces the original blue background with white:
$ ffmpeg -y -i /ramdisk/video.mp4 -f lavfi -i color=c=0xffffff:s=700x400 -filter_complex "[0:v]chromakey=0x0126fe:0.20:0.05[ckout];[1:v][ckout]overlay[out]" -map "[out]" -map 0:a -t $(ffmpeg -i /ramdisk/video.mp4 2>&1 | grep "Duration" | awk '{print $2}' | tr -d ,) /ramdisk/output.mp4
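If you'd rather build that command from Python than fight shell quoting, a sketch like this works. The key color, similarity, and blend values are the ones from the command above; the ffprobe call is my substitute for the grep/awk duration pipeline, not what my server actually runs:

```python
import subprocess

def get_duration(path: str) -> str:
    """Ask ffprobe for the file's duration in seconds
    (stands in for the grep/awk pipeline in the shell version)."""
    return subprocess.run(
        ["ffprobe", "-v", "error", "-show_entries", "format=duration",
         "-of", "default=noprint_wrappers=1:nokey=1", path],
        capture_output=True, text=True, check=True,
    ).stdout.strip()

def build_chromakey_cmd(infile: str, outfile: str, duration: str) -> list:
    """Assemble the ffmpeg argv: key out the blue background (0x0126fe)
    and composite over a 700x400 white canvas, trimming to the input's duration."""
    return [
        "ffmpeg", "-y", "-i", infile,
        "-f", "lavfi", "-i", "color=c=0xffffff:s=700x400",
        "-filter_complex",
        "[0:v]chromakey=0x0126fe:0.20:0.05[ckout];[1:v][ckout]overlay[out]",
        "-map", "[out]", "-map", "0:a", "-t", duration, outfile,
    ]

cmd = build_chromakey_cmd("/ramdisk/video.mp4", "/ramdisk/output.mp4", "180.0")
# subprocess.run(cmd, check=True)  # uncomment to actually run ffmpeg
```

Passing the arguments as a list means no quoting gymnastics at all; the filter graph string goes through untouched.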
The fish itself has its own microcontroller to handle the motors. I run them at lower power when stalled (i.e., when the mouth is held open) to hopefully reduce wear and damage. A timeout of a few seconds is also enforced, so no motor can ever get stuck active.
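The actual firmware lives on the microcontroller, but the protection logic is simple enough to sketch in Python. The duty-cycle values and the 3-second cutoff here are made-up illustrations, not the fish's real numbers:

```python
FULL_DUTY = 255      # PWM duty for a motor that's actively moving (illustrative)
HOLD_DUTY = 90       # reduced power while stalled, e.g. mouth held open
TIMEOUT_MS = 3000    # hard cutoff so no motor stays energized forever

def motor_duty(elapsed_ms: int, stalled: bool) -> int:
    """Return the PWM duty for a motor that has been on for elapsed_ms."""
    if elapsed_ms >= TIMEOUT_MS:
        return 0                          # timeout: force the motor off
    return HOLD_DUTY if stalled else FULL_DUTY

motor_duty(100, stalled=False)   # -> 255, normal movement
motor_duty(1500, stalled=True)   # -> 90, gentler hold
motor_duty(5000, stalled=True)   # -> 0, timed out
```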
The animations are just sent to the fish event by event, in order, over the serial port, using Python's time.sleep() to do all the timing. It's not tied to the camera at all, which means excessively long animations can drift out of sync with the audio. Luckily, the drift seems tolerable on the scale of a few hundred seconds or so.
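A minimal version of that playback loop looks like this. The event format, (delay_seconds, command_bytes), is my invented stand-in for whatever the fish actually speaks over serial:

```python
import time

def play(events, port, sleep=time.sleep):
    """Send animation events over a serial-like port, in order.
    Each event is (delay_s, command): wait, then write. Because the
    waits are relative, small sleep() overshoots accumulate over a long
    animation - which is exactly the drift described above."""
    for delay_s, command in events:
        sleep(delay_s)
        port.write(command)

# Exercising it with fakes instead of a real serial port:
sent, waited = [], []
play([(0.0, b"MOUTH_OPEN"), (0.25, b"MOUTH_CLOSE")],
     port=type("FakePort", (), {"write": staticmethod(sent.append)}),
     sleep=waited.append)
# sent   -> [b"MOUTH_OPEN", b"MOUTH_CLOSE"]
# waited -> [0.0, 0.25]
```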
Data privacy and such things
Look man, you're uploading stuff to a singing fish on the internet. I'm not sure what you're expecting.
Unless you leave me specific information in the notes box when you submit it, the only thing I can see is your IP and what you upload. No cookies, no accounts. I do keep everything that I receive, and I have copies of all the videos it makes. In an ideal case I'd like it all to be publicly available, but I can't do that because of 1) bandwidth, 2) copyright, and 3) the vile nature of the raw internet, which is the main reason you aren't watching this on Twitch.tv.
Recordings will hang around for 48 hours on this server before being deleted, but will otherwise still exist in my archives.
While I do encourage you to share the things you make with it, I will honor deletion requests. Please send inquiries and feedback to fishsinger@alnwlsn.com.
Or, you can try my website's comments wall or the aforementioned thermal printer. I might have a better chance of seeing those, since both display on dedicated devices.
What does it actually look like?
Why did you make this / This is stupid
Because I can / you're stupid.
How does this even make money?
It doesn't, but you can donate if you want to keep it around. Should this become unexpectedly popular, I'll consider offering codes to skip the queue and/or full HD video downloads.