Chess Clock 2.0 using an Adafruit MacroPad

by MACE

I built a better chess clock. It’s no surprise that my first attempt (see http://blog.workshop88.com/2021/07/30/making-a-chess-clock-with-a-circuit-playground-express-user-interface-decisions/), while functional, was difficult to use. The buttons are tiny, you need a lot of prior knowledge to use it, etc. My wife refused to use it.

Just in the nick of time, I received the Adabox 019. It contained a keyboard circuit, keys, keycaps, OLED display, rotary encoder, encoder knob and housing — some assembly required. Adafruit promotes this product as a way to send commands through a USB port to the foreground program running on a computer. There is also a MIDI use case.

The included demo code interprets key presses based on a menu of items that are specific to a given program. You can have multiple menus for multiple programs. You switch between menus using the rotary encoder. Each menu can set the neopixels under the keys to visually group keys by functionality. Here’s an example menu I did for Inkscape.

(Image: Inkscape MacroPad menu)

Key 1 (upper left) resizes the document to match the selection
Key 2 is for Trace Bitmap
Keys 4-8 are for manipulating layers

It worked, but this use case for the MacroPad wasn’t satisfying. For me, a keystroke on the pad isn’t any better than just using the shortcut keys on the full keyboard. I soon started to hunt for a bigger itch that needed a bigger scratch.

The first one I came up with was to use the example menu framework to simplify my use of various Linux terminal commands. The ls command, for example, has dozens of options. The combination of dashes, case sensitivity, non-mnemonic codes, etc. makes the command tedious to type — especially if you need to use many different options in various combinations. Not to mention that the meanings of the options differ depending on the flavor of Linux!

Here’s the menu I wrote for the terminal on MacOS. I took better care on this menu to color code the keys. The yellow keys are for formatting options, the green key is for recursion, Key 4 isn’t enabled, the red keys are for sort order and the purple keys are groups of file extensions. Pressing the rotary encoder’s button types the “ls ” command itself.

With an open terminal window, the MacroPad connected and my custom menu selected, I can use the keys to easily form the command that lists “code” files with units, long dates, sorted by size, in reverse order:

<Rotary><Key1><Key2><Key5><Key7><Key12> #These are the key presses

ls -h -T -r -S *.py *.sh *.sed #This is what gets typed at the prompt
-rw-r--r-- 1 appleadmin staff 75B Jul 9 09:40:36 2021 mymoduletest.py
-rw-r--r-- 1 appleadmin staff 72B Oct 19 09:59:35 2020 timezones.py
-rw-r--r-- 1 appleadmin staff 59B Mar 30 16:17:28 2021 getSkyCharts.py
-rw-r--r-- 1 appleadmin staff 58B Apr 2 16:39:43 2021 mymod.py
-rwxr-xr-x 1 appleadmin staff 46B Mar 31 06:38:26 2020 rhi.sh*
-rw-r--r-- 1 appleadmin staff 24B Aug 2 21:47:36 2021 goPad.sh

It works well.
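Adafruit's MACROPAD Hotkeys example drives menus like this from small Python files, one per app, where each macro is a (neopixel color, OLED label, output) tuple and a 13th entry, if present, fires on the encoder press. Here is a sketch of what my ls menu could look like in that format; only the keys used in the command above are grounded, and the other labels, colors and slots are hypothetical:

app = {
    'name': 'Terminal ls',
    'macros': [
        # (neopixel color, OLED label, text typed at the prompt)
        (0x202000, '-h', '-h '),                 # Key 1: human-readable units (yellow)
        (0x202000, '-T', '-T '),                 # Key 2: long/full dates (yellow)
        (0x202000, '-l', '-l '),                 # Key 3: hypothetical formatting slot
        (0x000000, '', ''),                      # Key 4: not enabled
        (0x200000, '-r', '-r '),                 # Key 5: reverse order (red)
        (0x002000, '-R', '-R '),                 # Key 6: hypothetical recursion slot (green)
        (0x200000, '-S', '-S '),                 # Key 7: sort by size (red)
        (0x200000, '-t', '-t '),                 # Key 8: hypothetical sort slot (red)
        (0x100010, '*.txt', '*.txt '),           # Keys 9-11: hypothetical extension groups (purple)
        (0x100010, '*.jpg', '*.jpg *.png '),
        (0x100010, '*.c', '*.c *.h '),
        (0x100010, 'code', '*.py *.sh *.sed '),  # Key 12: "code" files (purple)
        (0x000000, 'ls', 'ls '),                 # Encoder press: type the command itself
    ]
}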

Finally, having just finished my 2-Player+ Chess Clock, I made version 2.0 using the MacroPad. This time, I have a display screen to properly prompt for input and display time remaining, a rotary encoder with an “ENTER” feature for numeric value input and a physical, color-coded button for each player to push.
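My code isn't reproduced here, but a minimal sketch of the two input ideas, assuming the adafruit_macropad library (the function names, colors and key assignments below are mine), looks roughly like this:

import time
from adafruit_macropad import MacroPad

macropad = MacroPad()
lines = macropad.display_text(title="Chess Clock")

# Numeric input: turn the encoder to adjust, press it to "ENTER"
minutes = 5
last_pos = macropad.encoder
while not macropad.encoder_switch:
    pos = macropad.encoder
    minutes = max(1, minutes + (pos - last_pos))
    last_pos = pos
    lines[0].text = "Minutes/player: %d" % minutes
    lines.show()

# Two color-coded player keys and a countdown for each player
PLAYER_KEYS = (0, 2)                  # hypothetical key choice, one per player
remaining = [minutes * 60.0] * 2
macropad.pixels[PLAYER_KEYS[0]] = (0, 0, 255)    # player 0's button: blue
macropad.pixels[PLAYER_KEYS[1]] = (255, 80, 0)   # player 1's button: orange
current, last = 0, time.monotonic()
while min(remaining) > 0:
    now = time.monotonic()
    remaining[current] -= now - last  # only the mover's clock runs
    last = now
    event = macropad.keys.events.get()
    if event and event.pressed and event.key_number == PLAYER_KEYS[current]:
        current = 1 - current         # pressing your own key ends your turn
    lines[1].text = "P1 %5.0f  P2 %5.0f" % (remaining[0], remaining[1])
    lines.show()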

Here’s the result in action.

Making a Chess Clock with a Circuit Playground Express — User Interface Decisions

by MACE

We are makers. Some of us take an object and reshape it to make something new. Others of us assemble components together. For 50 years, I’ve made computer programs. And I still enjoy it.

In this post, I want to explain the user-interface challenges I faced while programming an enhanced chess clock.

If you’ve seen the miniseries The Queen’s Gambit (or just about any other movie with chess in it), you’ve seen chess clocks. Chess clocks are used to ensure games end in a reasonable time.

A chess clock comprises two linked stopwatches. There are two buttons on top. The left button stops the left clock and starts the right clock. The right button stops the right clock and starts the left one. The clocks measure the total amount of time each player has spent thinking about and making their moves. Each player is given a certain amount of time at the start of the game. If a player hasn’t beaten their opponent before their time runs out, that player loses. There’s an indicator that shows which clock is running, and a little flag that drops to indicate that time has run out. These days, digital chess clocks have largely replaced analog ones.

Now that looks like something makeable. A couple of 4-digit, 7-segment displays, a few buttons, a 3D printed case, some LEDs, a microcontroller — and a little bit of programming.

I took a different approach. I didn’t feel like buying and assembling the individual components. I wanted to write the code and begin using it right away. Also, while chess is a two-player game, my family plays Rummikub, which can have 2, 3, 4 or more players — so I wanted a variable number of clocks. I decided to use a Circuit Playground Express (CPX) from Adafruit.

The CPX is a microcontroller board that includes 1 red “status” LED (Pin 13), x,y,z-accelerometer, light meter, JST battery port, USB port for communication and/or power, temperature sensor, digital and analog pins, 7 capacitive touch sensors, 3.3 and 5.0V power and ground, two clicky push buttons, slider switch, infrared tx/rx, speaker, microphone, and (this is the best part) 10 RGB pixels.

It’s a great product to introduce people to programming. Programming can be done with MakeCode (a block based language), CircuitPython or the Arduino IDE. I chose to use CircuitPython and the Mu IDE to make my clock. When the CircuitPython firmware is installed, the CPX appears as a thumb drive called CIRCUITPY when plugged into your computer. If you copy a file called code.py to the root of CIRCUITPY, the firmware starts to run it. The firmware senses whenever a new code.py overlays the previous one and auto-restarts.

The challenge with this project was to design a user interface that can be used for a variable number of countdown timers using only the built-in components on the CPX. Here are the decisions I made.

How to tell how many players will be playing.
When the CPX is powered up, it immediately launches the code.py program, which waits to be told how many players there are. It assumes at least two players, so it lights the first two pixels red. If there are more players, press and hold the A button until additional pixels light up. It’s possible to accidentally turn on more pixels than you intended; if so, pressing the B button turns pixels off. Code prevents you from setting fewer than 2 players. To finish up this step and move on to the next, tap the A1 capacitance pad. (My CPX is housed in a 3D-printed enclosure and the pads are awkward to reach. To solve this, I clipped the head of an alligator clip to the pad. Rather than touching the pad, I can touch the tail of the alligator clip.)
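A minimal sketch of this selection step, assuming the adafruit_circuitplayground library (the colors and the sleep-based hold-to-repeat rate are my guesses):

import time
from adafruit_circuitplayground import cp

num_players = 2
while not cp.touch_A1:                      # tap the A1 pad to lock in the count
    if cp.button_a and num_players < 10:    # hold A to add players
        num_players += 1
        time.sleep(0.4)                     # crude hold-to-repeat rate
    if cp.button_b and num_players > 2:     # B removes one, never below 2
        num_players -= 1
        time.sleep(0.4)
    for i in range(10):                     # one red pixel per player
        cp.pixels[i] = (30, 0, 0) if i < num_players else (0, 0, 0)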

How to indicate whose turn it is.
With 2 players, each player gets allocated 5 pixels. Each set of 5 is a different color. With 4 players, each player gets 2 pixels. A player’s pixels light up in the appropriate color when their turn begins.

How to indicate that the clock is running.
I use the red status LED. Each second, I toggle it.

How to end your turn, stop your clock and start the next player’s clock.
When the current player is done, they click their button – either button A or button B. When the next player is done, they click their button. Internally, players are numbered 0 to n. Even numbered players share button A and odd numbered players share button B.
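Continuing the sketch above, the pixel allocation, the once-a-second LED toggle and the even/odd button sharing might look like this (COLORS and the starting time are made-up values):

start_seconds = 10 * 60.0                   # hypothetical: ten minutes each
remaining = [start_seconds] * num_players
PIX_PER_PLAYER = 10 // num_players
COLORS = [(30, 0, 0), (0, 30, 0), (0, 0, 30), (30, 20, 0)]

def show_turn(player):
    cp.pixels.fill((0, 0, 0))
    for i in range(player * PIX_PER_PLAYER, (player + 1) * PIX_PER_PLAYER):
        cp.pixels[i] = COLORS[player % len(COLORS)]

current = 0
last = time.monotonic()
while min(remaining) > 0:
    now = time.monotonic()
    remaining[current] -= now - last        # only the current clock runs
    if int(now) != int(last):
        cp.red_led = not cp.red_led         # heartbeat: toggle once a second
    last = now
    # even-numbered players share button A, odd-numbered share button B
    if (current % 2 == 0 and cp.button_a) or (current % 2 == 1 and cp.button_b):
        current = (current + 1) % num_players
        show_turn(current)
        time.sleep(0.3)                     # wait out the button press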

How to indicate which player makes the first move.
Games don’t usually last a long time (although some players in my circle have historically taken a LONG time to make their move — hence, the need for this clock) and it’s common to play a second or third game after the first is finished.

For the first game, one of the players is designated the starting player (however that gets decided). That is player #0 and is assigned the first set of pixels and button A. The next player is assigned the second set of pixels and button B. This continues for all remaining players. When playing several games in a row, the starting player of the first game is not necessarily the starting player of the second game. For the second and subsequent games of the tournament, a new starting player must be indicated.

The number of players is the same, the order of players is the same, and player numbers, pixels and buttons have all been assigned. None of that needs to change. But the clock does need to know which player will make the first move. The CPX prompts for the identity of the starting player by lighting the 1st pixel blue. If the starting player is actually, say, the 3rd player, holding the A button lights up additional pixels; continue to hold until the starting player is indicated. Code prevents you from turning on more pixels than there are players and from turning off all the pixels. Tapping the A2 capacitance pad locks that choice in.

When to start the first clock.
The number of players is set. The player to make the first move has been set. The next thing is to activate the clock of the starting player. This is done by covering the light sensor with your hand. When the sensor detects a significant change in brightness, it starts the countdown clock of the starting player, illuminates their pixels and sets the status LED blinking.
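In the same sketch style, the start trigger could be as simple as waiting for the light level to drop well below its ambient baseline (the 50% threshold is my guess):

baseline = cp.light                         # ambient light level at setup time
while cp.light > baseline * 0.5:            # a hand over the sensor darkens it
    pass
show_turn(current)                          # light the starting player's pixels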

How to show how much time is left.
In this first implementation, the CPX is connected to the Mu editor on a laptop via a USB cable which supplies power to the CPX. The Mu editor includes a serial monitor to which the CPX can write status information (like each player’s time remaining). Mu also has a built-in plotter. If a tuple is written to the console, the plotter will display the values of the tuple in a time graph. If you want to know how much time you have, just glance at the plotter (or the serial monitor). Here we see that player 1 (zero relative) has run out of time and gone negative.
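Since Mu’s plotter graphs any tuple printed to the serial console, emitting the remaining times is all it takes (continuing the sketch above):

# each element of the printed tuple becomes one line on Mu's plotter
print(tuple(round(t, 1) for t in remaining))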

A future implementation of the clock will be battery powered and there will be no plotter nor serial monitor. Instead, a single 4-digit, 7-segment display connected to the CPX’s I2C interface will display the time remaining for the current player. This exceeds the design goal of no additional components, but, what the heck.

What to do if the current player’s time runs out.
I play a sad sequence of tones on the speaker and light their LEDs red.

Here is a two-minute video demonstrating a four-player game with an artificially short time limit.

Yet to be programmed:
Set the length of time: Maybe light up the LEDs so that their binary value is the number of minutes. Ten pixels would allow a 17+ hour limit for each player (see the sketch after this list).
Indicate someone won (rather than someone timed out), game over, start next game. This could be done by shaking the CPX since there is a built-in cpx.shake() function that can be queried.
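A hypothetical sketch of that binary readout, in the same style as the snippets above:

def show_minutes_binary(minutes):
    # light pixel i if bit i of the minute count is set;
    # 10 pixels cover 1..1023 minutes (just over 17 hours)
    for i in range(10):
        cp.pixels[i] = (0, 20, 0) if (minutes >> i) & 1 else (0, 0, 0)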

Laser Cutting a Jigsaw Puzzle

by MACE

My 2.5 year old grand-niece loves jigsaw puzzles. I thought it would be nice to make a custom puzzle for her using a photograph of people she knows.

Here’s how I did it.

My approach was to take a photograph, glue it to a substrate and laser cut it into interlocking puzzle pieces. I’ve seen my niece assemble 25-piece puzzles, so I wanted to keep the total number of pieces to about that number. The pieces need to be an appropriate size to fit her little fingers. Combining this with the 25+/- constraint, I decided to order an 8×10 print and to cut it into a 5×4 matrix. Twenty pieces is a little light on piece count but it matches the aspect ratio of the photo.

Puzzle pieces need to be sturdy. Simply cutting the photo into pieces wasn’t satisfactory; the pieces need a stiff backing. I considered three materials — 1/16″ basswood sheet, acrylic, and heavy card stock. I ran test cuts on all three materials. I printed three 5×6 test photos, used Loctite 300 spray adhesive to glue the photos to each of the above materials and let them cure overnight. Then I covered the photos with blue painters tape to prevent charring of the photo. It’s best if the width of the tape is LESS THAN the size of a puzzle piece. The 300 adhesive helps ensure that the photo does not separate from the substrate either during the cutting process or when removing the blue tape.

The first material I tested was the card stock. I’d picked up some mat board typically used for picture framing. I cut a series of 1″x1″ squares through the tape, photo and substrate with various laser settings. The best setting seemed to be 10% speed, 100% VC, 100% power and 2 passes. The pieces cut very cleanly. The edges of the square were darkened, but the backside only had a little bit of soot and the blue tape completely protected the photo. The next test cut was a single, 2″x2″ jigsaw-puzzle-shaped piece. I found and downloaded a vector image of a single puzzle piece. It cut beautifully. The stiffness was perfect — just like a commercial puzzle — and it fit well when dropped back into the hole left in the test material. I didn’t bother test cutting the other materials.

Finding a 5×4 vector jigsaw puzzle template for the production run was more difficult. I found plenty of .jpg images, but converting them to vectors for cutting exceeded my Inkscape skills. I ended up buying a collection of 13 templates in .svg format on Etsy for $2.50.

For the production run, I printed an 8×10 image and glued it to the mat board. In Inkscape, I opened the 5×4 puzzle template and sized it to 7.5″x9.5″ — slightly smaller than the image. I colored the interior lines blue and the outline red. The plan was to cut the blue first and the red last. This way, the pieces would stay together until the outer perimeter cut through. I added a second layer to the bottom of the layer stack, imported the original .jpg that I had printed and resized it to 8×10. With the two layers superimposed, I made whatever subtle changes were necessary to ensure the best placement of pieces within the boundaries of the photo. Once satisfied, I turned the background/photo layer off and sent it to the laser.

Surprisingly, the settings from the test cuts failed on the production run and I ended up using speed 30% and 3 passes. Once cut, I took each piece out of the laser one at a time, stripped the blue tape and reassembled the puzzle to ensure all pieces fit nicely.

When stripping the tape, there’s always the risk of pulling up the photo as you scrape along the edge of the piece trying to get a foothold to pull. By keeping the width of the tape small, you can ensure that the center of the piece will have two strips of tape overlapping each other.

It’s far easier to scrape at that seam to get started. Once the first strip of tape is removed, it’s easy to start the second by scraping across the flat, center surface of the puzzle piece instead of along the edge of the piece.

Here is the final result.

I did eventually test cut the other materials. The 1/16″ basswood was sturdy, but a little thin. The acrylic was beefy thick, but light and strong.

Here are the details of the mat board purchased from Michael’s:

Can I get a hand here?

by MACE

I’m a new maker. It started simply enough – Raspberry Pi Zero W, DHT22 temp sensor, SD card, power supply. But it was not long before my appetite for new projects and more components grew. I bought LEDs, resistors, jumpers, DIP switches, camera, heat shrink, POTs, PIR, servos, steppers, … ENOUGH! I had a monkey on my back. But, what to do?

I eventually got to a point that the project I was building required soldering. Let me think — I’ll need one hand for the left wire, one hand for the right wire, one hand for the iron, one hand for the solder. A quick inventory of appendages led me to conclude that I needed an extra hand (or two).

Amazon to the rescue.

Now you’re talking. Clearly, I gotta have at least one of these. Right? Well, maybe not! I had just been in our laundry room and noticed we still had a box of wooden clothes pins. They’re kinda like the alligator clips on the Helping Hands. Surely they could be used to hold onto stuff.

But I couldn’t just lay the pins on the workbench. So I rummaged for a bit longer and found a discarded hinge from an old door. When opened to its fullest, the plate stands at a 50 degree angle. A little hot glue and I had all the helping hands I needed — for next to nothing.

How about that! I made, not bought, something to help me make something else.

Here’s the bottom line. Do I need a Helping Hands? No, not right now. Am I going to buy one? HECK YEAH — that thing is awesome looking.

Hacking a toolchain to make Atari 8 bit YouTube Mandelbrot Zoom Videos

In a previous blog post I explained all about the color cycling Mandelbrot Set explorer I wrote in 10 lines of Atari Basic for the BASIC 10 Liner contest (Second place!). While working on this project I thought it would be really cool to create a Mandelbrot Set Zoom video rendered on the Atari, which led me on an interesting journey…

I wanted to make something like this deep Mandelbrot Set zoom video

Or this video I found later that animates the color palette while zooming.

The Atari BASIC Mandelbrot Set renderer is not able to execute millions of calculations for every pixel; in fact, it is configured to execute 81. At that depth the numerical precision of Atari BASIC’s floating point numbers appears to start breaking down, and with its 1.79MHz 6502 CPU, more iterations would start to take a very, very long time.

The program can zoom in and out on 12 interesting preset locations on the complex plane of the Mandelbrot Set. I realized that if I captured a screen at each zoom level I could scale the bitmaps and make an image that had very high resolution (small pixels) in the center, and then zoom in on it. I also thought that if I captured video of the color cycling I could scale and synchronize the videos to zoom in while color cycling, and by offsetting the color cycling by a fixed amount with each zoom level I could create synchronized, color-coded animated frames that better showed the zoom levels.

Having a concept or idea is one thing, but the project was more complicated than expected. The final implementation involved virtual machines, emulators, hundreds of lines of code, and a convoluted still-image and video toolchain built from image- and video-editing tools – entirely with free and open source software. In the end it is more of a convoluted hack than a toolchain, but it worked and that’s what counts!

I won’t keep you in suspense forever

Here are 6 of the 24 resulting Atari BASIC Mandelbrot Set zoom videos; in the ones with frames, each colored frame is half or twice the size of its adjacent frames:

Mandelbrot Zoom Videos

Videos with colored frames

Here are complete YouTube playlists of all 24 of the Mandelbrot Set zoom videos: 12 locations rendered without and 12 with colored frames. The first 3 videos in each playlist are the ones above, so if you’ve already seen them you may want to skip ahead to see the remaining 9 videos in each playlist.

I reached out to the fantastic Atari chiptunes artist Adam Sporka and he generously offered to share his music, performed on a real Atari 800, for use in these videos. I think it is totally awesome that all graphics and audio were created on Atari 8 bit computers.

Here’s how I did it…

This blog isn’t necessarily a “how-to” or “follow along” article as I assume you don’t need to make color synchronized videos from Atari BASIC, but I detail the challenges, remedies (hacks), and free and open source tools used to solve the problems encountered along the way and generate the final videos. I hope the troubleshooting strategies, solutions, and tool details are interesting and helpful to you.

In the Atari800Win-PLus emulator I ran the program, visited each location, and zoomed all the way in and all the way out, each time taking ~30 minutes to render and then recording a ~20 second video. The 297 videos took most of a weekend to render and capture. I ran the emulator in 12 Windows XP virtual machines in VirtualBox on my (KDE Kubuntu) laptop, so I was able to keep the process moving. This animated .gif was captured from the desktop using Peek while I was doing these captures.

When capturing video, the emulator has a limited set of video codecs to choose from: Cinepak Codec by Radius, Microsoft Video, Microsoft RLE, and Intel IYUV. When testing, Cinepak did not seem to be compatible any more, and I decided to use the Intel IYUV format because it was compact and looked good. I did run into a horrifying problem: after all the video was captured I looked at it in VLC on the laptop and, bizarrely, the video was mirrored. This appears to be an issue with the Linux version of the playback codec, and to my relief did not pose a problem in Windows.
(* WHEW! *)

I used folder sharing in VirtualBox to capture the videos from all the emulators running in the virtual machines into a single shared folder, and then I copied them all to a folder shared on my Windows machine (via Samba).

Each of the 12 locations had several videos with numbered filenames like “Thorny Lightning 0.avi”, “Thorny Lightning in 2.avi” or “Thorny Lightning out 3.avi” where the “in” and “out” indicated the number of times it was zoomed from the default zoom level named “0”.

Catastrophe!

I thought I could use video editing software to simply composite the Mandelbrot Set zoom, but I was wrong, very wrong. I thought I would manually set the in and out points in each of the 297 video clips in video editing software, align them on the timeline, set some zoom interpolation, and voila! … but when I tried a test sample I discovered two massive problems:

  1. The video zoom effect is not linear, it is exponential: the image doubles in size at regular time intervals. This is something that is much harder to do in video editing software, and is VERY hard to synchronize across several aligned, composited color cycling videos.
  2. The color cycling was not synchronized between videos. Even though I could perfectly match the first and last frames of a color cycle sequence in each video and scaled the videos on the timeline to the same length, the middles of the color cycling sequence were often out of sync leading to flickering that ruined the quality of the video (and yet gave me the idea for the colored frame version of the videos).

Handling zoom rate (aka scripted scale and compositing hack)

The only way I thought to handle the zoom issue was to programmatically composite the frames. This would require exporting the frames from the .avi files, identifying the synchronized frames from each of the source avi’s, and compositing them together so all of the images were scaled and registered to one another perfectly while zooming exponentially.

For processing I extracted all the frames of each .avi video into numbered bitmaps in a _frames folder using FFmpeg. To do so I used a hybrid manual/automated process, which is a fancy way of saying I kept editing a batch file until I got everything I needed. The batch file basically looked like this:

for /r %%i in ("..\Thorny Lightning*.avi") do (
ffmpeg -i "%%~pi%%~ni.avi" -filter:v "crop=320:192:10:24" "_frames\%%~ni %%04d.bmp"
)

This would process all the .avi files from a particular location, in this case “Thorny Lightning”, using a wildcard in a for loop. The script calls FFmpeg once on each .avi file that matches the wildcard, inputs the .avi, crops the black overscan border from the images and saves them as numbered bitmaps in the “_frames” folder. After processing all 297 videos I had 413,153 numbered .bmp files (70GB).

To composite the frames in an exponential zoom I wrote a Python script that uses ImageMagick to scale and composite the source .bmp frames into zooming video clip frames. The scaling ended up being simpler than expected. Since the image grows geometrically with time, the scale at time t is 2^t, and since each video is half the size of the next largest I could divide the dimensions in half for successive frames, all centered at the center of the image. For each output image the program generates command lines similar to this one to execute ImageMagick:

magick convert -size 320x192 -gravity center ( "_frames\Thorny Lightning out 15 0133.bmp" -sample 329x198 ) ( "_frames\Thorny Lightning out 14 0144.bmp" -sample 165x100 ) -composite ( "_frames\Thorny Lightning out 13 0120.bmp" -sample 83x51 ) -composite ( "_frames\Thorny Lightning out 12 0140.bmp" -sample 42x26 ) -composite ( "_frames\Thorny Lightning out 11 0109.bmp" -sample 22x14 ) -composite ( "_frames\Thorny Lightning out 10 0116.bmp" -sample 12x8 ) -composite ( "_frames\Thorny Lightning out 9 0130.bmp" -sample 7x5 ) -composite ( "_frames\Thorny Lightning out 8 0128.bmp" -sample 4x3 ) -composite ( "_frames\Thorny Lightning out 7 0121.bmp" -sample 3x2 ) -composite ( "_frames\Thorny Lightning out 6 0120.bmp" -sample 2x2 ) -composite ( "_frames\Thorny Lightning out 5 0129.bmp" -sample 2x2 ) -composite -crop 320x192+0+0 +repage "Thorny lightning\frame_ioi_00000005.bmp"

This line makes a 320×192 bitmap composited from 11 scaled source bitmaps. Finding the right settings to get ImageMagick to composite the videos the way I wanted, with cropping and point sampling (instead of anti-aliasing), was a challenge. It is a very powerful tool that can process images in a multitude of ways, often offering many ways to accomplish the same or similar results (ImageMagick reference).
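The real script isn’t reproduced here, but a sketch of the command-generation idea in Python looks like this. The ~3% oversize of the outermost layer matches the 329x198 sample in the example above; the rounding behavior and the function name are my own:

import subprocess

W, H = 320, 192                             # Atari frame size after cropping

def build_magick_cmd(layer_bmps, out_bmp, oversize=1.03):
    # layer_bmps is ordered from the outermost (largest) zoom level inward;
    # each deeper level is sampled at half the size of the previous one.
    cmd = ["magick", "convert", "-size", "%dx%d" % (W, H), "-gravity", "center"]
    w, h = W * oversize, H * oversize
    for i, bmp in enumerate(layer_bmps):
        cmd += ["(", bmp, "-sample", "%dx%d" % (max(1, round(w)), max(1, round(h))), ")"]
        if i > 0:
            cmd.append("-composite")        # merge this layer onto the stack
        w, h = w / 2, h / 2
    cmd += ["-crop", "%dx%d+0+0" % (W, H), "+repage", out_bmp]
    return cmd

# one output frame from 11 source frames, as in the example above:
# subprocess.run(build_magick_cmd(source_frames, "frame_00000005.bmp"), check=True)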

Color cycling synchronization (aka manual image index hack)

The source color cycling videos were manually captured and contain more than a single clean, synchronized color cycle loop (extra frames at the beginning and at the end of the video), so I manually looked at the bitmaps to find the indexes of the first and last frames of the color cycle and included those in the program to create a test video. The resulting video finally zoomed perfectly but was still ruined by flickering due to the un-synchronized color cycling.

Digging a little deeper…

Atari BASIC is not frame synced, meaning it does things continuously and without synchronization with the TV display signal. The Atari computer emulator on the other hand renders video at 60Hz, one frame for every NTSC field emulated.

I started analyzing the bitmap sequences in Python using Pillow (a fork of the Python Imaging Library, or PIL) for image processing. Initially I tried to detect the number of colors in an image to decide when the color cycling was changing, but that was vexed by the scan lines effect I had enabled in the emulator, which caused anti-aliasing of some of the lines and a lot more than the 9 colors of Atari graphics mode 10 I was expecting (I still think it looks cool). While experimenting further I noticed that successive video frames changed while the color cycling was copying one register to the next and calculating a new color, but that there were many duplicate frames while the rest of the Atari BASIC program executed its processing loop. Additionally, I was aware that at one point in the color cycling all of the onscreen colors would be gray scale (actually close to, but not perfectly, RGB gray).

The winning strategy (aka image processing frame sync hack)

I scan each bitmap sequence from the beginning to find the first frame that is entirely gray scale, then scan from the back for the first frame of the matching gray sequence at the end; this defines the range of frames for one full color cycle. Within that range I compare each frame with the next until I find a pair of duplicate frames, and store the first duplicate frame’s index. There are exactly 128 frames in a color cycling sequence: 8 brightness cycles of 16 hues. Adding a multiple of 8 synchronizes brightness but offsets hue; this is what produces the colored frames effect. When the process is done it should have discovered the exact 128 frames representing one color cycle for that bitmap sequence (from the .avi) at that zoom level.
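A sketch of that detection pass, assuming Pillow; the gray tolerance reflects “close but not perfectly RGB gray”, but the exact value and the function names are my guesses:

from PIL import Image

def is_grayish(path, tol=12):
    # a frame counts as gray if every pixel has R, G and B within tol
    img = Image.open(path).convert("RGB")
    return all(max(p) - min(p) <= tol for p in img.getdata())

def same_frame(path_a, path_b):
    a = Image.open(path_a).convert("RGB")
    b = Image.open(path_b).convert("RGB")
    return a.tobytes() == b.tobytes()

def find_cycle(frames):                     # frames: ordered list of .bmp paths
    start = next(i for i in range(len(frames)) if is_grayish(frames[i]))
    end = next(i for i in reversed(range(len(frames))) if is_grayish(frames[i]))
    for i in range(start, end):             # anchor on the first duplicate frame
        if same_frame(frames[i], frames[i + 1]):
            return frames[i:i + 128]        # 8 brightness cycles x 16 hues
    return None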

More complexity

This almost worked perfectly, except Atari BASIC is not perfect and I am not perfect. Atari BASIC would occasionally let an extra duplicate frame occur, and I occasionally recorded two or three color cycles that would result in 256 or 384 (or more) frames instead of the expected 128 frames: in either case it messed things up.

More fixes (aka extra images Gimp hack)

The unwanted duplicate frames were pretty easy to find; usually duplicate frames were 9 or 10 frames apart, but when an extra one was inserted, they were only 4 to 6 frames apart. When the program detects anything other than 128 frames in a cycle it dumps all the frame offsets and the number of frames between offsets. I manually tracked down the extra frames and defaced them in Gimp so they were no longer duplicates. I re-recorded the videos I had screwed up (deleting the bitmaps I had made and making new ones with FFmpeg), and was back in business.

What now?

By now Adam had agreed to share his music and I had to figure out which songs I thought matched each video. To do that I created some more batch files, this time using FFmpeg to create .mp4 video files from the exported .bmp frames and the .mp3 music Adam provided. I used command lines like this to generate test videos with music:

ffmpeg -y -r 60 -i "Thorny Lightning\frame_ioi_%%08d.bmp" -i "music\Stack 'Em up Adam Sporka.mpeg" -c copy -map 0:v:0 -map 1:a:0 -r 60 -c:v libx264 -pix_fmt yuv420p -preset slow -crf 19 -vf scale=iw*4:ih*4 -sws_flags bitexact "_out\Thorny Lightning ioi x4.mp4"

This reads the bitmap sequence at 60fps along with the music Adam provided, mapping input 0 (the bitmap sequence) to video and input 1 (the music) to audio, at an output framerate of 60fps using the h264 codec with very low loss, at 4 times the resolution (rumor has it that uploading higher-resolution video to YouTube increases the quality), with the bitexact flag telling it to resample the image rather than use a smoothing, enlarging filter.

Here is a similar, simpler command line that generates only the video, with no sound.

ffmpeg -y -r 60 -i "Thorny Lightning\frame_ioi_%%08d.bmp" -r 60 -c:v libx264 -pix_fmt yuv420p -preset slow -crf 19 -vf scale=iw*4:ih*4 -sws_flags bitexact "_out\Thorny Lightning ioi q x4.mp4"

These were excellent tests for adjusting the color cycling and zoom rates and seeing which of Adam’s songs matched the 12 Mandelbrot Set locations, but the videos had no titles, credits, or transitions. They were previews, but they lacked a finished polish.

Are we there yet? (aka final composition)

I used HitFilm Express to edit the 12 video projects and render 24 videos. I imported the bitmap sequences as clips (not the compressed preview .mp4s), created intro and outro titles, composited the video with the awesome music provided by Adam Sporka, and synced up all the fades. I exported two versions of each video, one with colored frames and one without. Then I uploaded them to YouTube and made the descriptions, end cards, and so on. You know the rest.

In conclusion

In the end I love the way the videos turned out. It’s amazing to think that everything you see and hear in these videos was created on Atari computer technology that was invented before Benoit Mandelbrot first visualized the Mandelbrot Set at IBM on March 1st, 1980. The Atari BASIC script that rendered every frame was only 10 lines long, but it took a lot of creativity to hack together a free and open source toolchain involving VirtualBox running 12 Atari800Win-PLus emulators, half a dozen batch files, FFmpeg, ImageMagick, Gimp, 560 lines of Python using Pillow, and finally HitFilm Express to generate the final videos.

Stay creative, and support your creative communities!

Maker Meeting Short Take – Schlieren imaging!

Check out the latest in our series of outtakes from Workshop 88 Maker Meetings here:

See the rest of the discussion in the original video

The reason behind using the Schlieren system was to find out what we could see while blowing over the top of a plastic pop (soda) bottle. There was quite a bit of discussion and suggestions for experimentation in the original Maker Meeting video.


Workshop 88 is a makerspace in Glen Ellyn, Illinois. We are more than a workshop; we are a growing community of creative, talented people who aspire to learn and share knowledge, experiences, and projects.

Join us! To become a member, join at Workshop88, or help us continue to share our projects and activities by supporting us via Patreon.

Never miss a tip or project! Follow our blog at www.Workshop88.com, subscribe to Workshop88’s YouTube channel, like us on Facebook, follow us on Twitter and join or support our maker community by contributing to Workshop88 on Patreon!

To find out about upcoming events follow Workshop88 on Meetup.
Have a question? email us at info@Workshop88.com

Maker Meeting March 23, 2021 – ESP8266, Arduinos, and a Dalek

Check out the March 23, 2021 Workshop 88 Maker Meeting here:

ESP8266 interfaced with a keypad

Peter shared the following information about using the Arduino keypad library with the ESP8266 after the recording:

There is a pretty good tutorial at https://diyi0t.com/keypad-arduino-esp8266-esp32/

There were two questions I was not able to answer, but I’ve done some more investigation:

  1. If you register a callback function that is invoked when a key is pressed or released, is that based on interrupts? No. It ties into the non-blocking getKey() function, which is already being called each time through the loop. If a callback is registered, it will be called by getKey() when appropriate.
  2. Does multi-key support mean that it buffers up a sequence of characters, or does it mean you can press multiple keys (chords) at the same time? Despite my wrong guess last night, it’s the latter. The getKeys() function will return a list of up to 10 keys that are pressed or released, even if a second (or later) key is pressed before the first ones are released. Apparently, they do the right magic with pull-ups and only driving one line at a time so that they can do this without diodes on the switches.

Other items shared!

After Peter’s presentation, Jim shared with us how he uses an ESP8266 to control an Arduino. Very cool! At the end, Dave shared with us his progress working on a Dalek build project – including some very nice resin casting!



Maker Meeting March 16, 2021

Check out the March 16, 2021 Workshop 88 Maker Meeting here:

Things We’ve Made

3D printed for everyday use. A place for everything and everything in its place. (Organizers)

Vase Mode – extra thin walls

Modular Tool Caddy

Replacement Parts – Repairs around the house by Peter

Thingiverse Links

Power Strip Project (Slide Deck)

An approach to problem solving

Designing 3D printed parts to help supplement a situation.



Maker Meeting March 9, 2021

Check out the March 9, 2021 Workshop 88 Maker Meeting here:

HackADay

Basic (Code) in 10 lines or less contest

Scott’s project overview of his contribution to the contest.

Mandelbrot

Must see to enjoy. It’s mesmerizing.

TinkerCad Follow Up

  • Logos
  • Shapes Collection

Around the House Repairs

Bob showed what he 3D printed to replace a broken part around the house. Next week, bring examples of things you made (instead of bought) and used around the house.

Misc Discussions

Acetone Smoothing of 3D Prints

Ferrofluid (https://en.wikipedia.org/wiki/Ferrofluid)

MandelBulb (https://en.wikipedia.org/wiki/Mandelbulb) Ryan Bliss (Digital Blasphemy)

3D Printer Hardware

