Jan 18 2020
 

They’re warp coils now, not just tubes filled with stuff to create a lighting effect that kind of reminds me of warp coils- they’re actual warp coils.

There was a disappointing development with the test rig. A few days after I made it, the silicone curing in the tubes shrank up inside and left a big air gap that screws up the light transmission around the tube. It still looks okay, but you see a lot of light blasting out the back that wasn’t there before.

I thought about trying again but filling up the printed holder with silicone too, so that when the fill shrinks it pulls in more silicone from the holder, but I’m not sure that would work or would be worth it, and it’s messy enough as it is.

The more obvious solution is to just fill the tubes and let them cure, then cut the tubes down to where the silicone shrinks and jam that into the LED. It’s not as ideal as letting the whole thing cure in one clear continuous mold, but I’ve noticed the interface between the LED and the tube filling isn’t as critical as I thought. A perfect seal with the silicone seems only about 10% better than just jamming the silicone in after it’s cured. Obviously that’s just me ballparking it based on observation, so who knows. I think if there is something to be gained from a perfect interface between LED and waveguide, it’s so unattainable it’s not worth the effort past just making sure they’re pretty well crammed in there with no big bubbles or gaps. But since I had the silicone out I figured I’d get creative and try a few things:

~Air: Probably should have started with this but for some reason I never tried it. The fact that it doesn’t work doesn’t excuse the fact that I should have tried earlier.
~Water: Sealed water in the tube with silicone.
~Fluorescent Water: Soaked a yellow highlighter in hot water and sealed that in the tube with silicone.
~Water So Fluorescent it’s Opaque: Soaked 5 colors of highlighter in hot water… more is not always better.
~Silicone: Clear silicone again. This time cured and cut before installing.

In the ring tower from bottom to top it’s air, yellow fluorescent water, and silicone.

The water tube leaked a bit when bent into a circle, so I put it on the older tube tester. The fluorescent cocktail is just kind of gross. Maybe a lighter solution would work better, but I think I’m just going to count that as a total fail and pretend I never did that.

~Air: Apparently air is worthless; it obviously doesn’t refract light enough. There’s no value to air whatsoever, never use air for anything.

~Water: Water doesn’t seem to work as well as silicone, but it’s not a fair test because I’m guessing how it would look with both ends lit. I don’t think water has any advantages for the warp coil effect. It has an equivalent or greater messiness and pain-in-the-neck factor compared to silicone and doesn’t work quite as well. Working with water adds the possibility of a spill killing the board, while silicone actually waterproofs everything. Also the LED->Silicone->Water interface is probably why it doesn’t work as well, and it feels like it kind of defeats the whole purpose. The only use I can think of is that since it’s a liquid it could do snow-globe particle effect stuff. If I could keep the bubbles small it might be interesting, but any air forms one bubble and does this action here. Kind of cool how it creates the swimming pool lighting effect when it reaches the LED, but idk how to use that right now.

~Fluorescent Water: The yellow is an interesting effect. I think I’m seeing a bit of a luminance spike in the yellow\red range but I’m not sure how much of that is really UV being converted to visible light or how much is just the tint of the dye. In any case I don’t think it’s really worth the effort for this, but something to think about for special applications. The effect is only really dramatic when you compare under reflected light like these photos with the camera flash on and off:

Flash OFF
Flash ON

~Silicone: Just looking at the tubes, it seems like silicone has the highest index of refraction. I think that’s a desirable characteristic for waveguides, but I’m not sure to what extent it matters since I’m doing zero calculations and this is really just artistic diffusion, and I’m not even sure I’m using the term waveguide correctly in this context. Anyway, silicone seems to be the highest and that’s the best. Also it clearly just looks better in the rig to me. Only problem is now when I look at them I can’t stop thinking they’re shower curtain rings.
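For a rough sense of why a higher index helps, here’s the back-of-the-napkin total internal reflection math, treating the filled tube as a bare core surrounded by air and assuming typical published values (roughly 1.41 for clear silicone, 1.33 for water- I haven’t measured either):

```latex
% Critical angle for total internal reflection (core surrounded by air, n_air = 1)
\theta_c = \arcsin\!\left(\frac{n_{\mathrm{air}}}{n_{\mathrm{core}}}\right)
% Silicone: n \approx 1.41 \;\Rightarrow\; \theta_c \approx 45^\circ
% Water:    n \approx 1.33 \;\Rightarrow\; \theta_c \approx 49^\circ
```

A lower critical angle means a wider range of rays from the LED gets trapped and carried around the ring instead of leaking out the side, which roughly matches what the tubes look like.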

And since silicone is clearly the superior material I went ahead and cut down the rings from the first test and jammed them back in the LEDs. The rings are a little small now but I’ll remake it all later. But here’s a video of the rig with the cured silicone and slightly modified demo code including an actual warp coil effect for these actual warp coils.
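I’m not posting the demo code itself, but to give the general idea, here’s a minimal sketch of the kind of ‘coil charge’ chase I mean, written against FastLED with a made-up pin and LED count- treat it as a starting point, not my actual rig code:

```cpp
#include <FastLED.h>

#define DATA_PIN   6      // hypothetical data pin
#define NUM_LEDS   18     // hypothetical LED count for the rig
#define BRIGHTNESS 128

CRGB leds[NUM_LEDS];

void setup() {
  FastLED.addLeds<WS2812, DATA_PIN, GRB>(leds, NUM_LEDS);
  FastLED.setBrightness(BRIGHTNESS);
}

void loop() {
  // Warp-coil style chase: a blue pulse sweeps along the strip while
  // everything behind it fades out, leaving a trailing glow.
  static uint8_t pos = 0;
  fadeToBlackBy(leds, NUM_LEDS, 40);   // fade every LED a little each frame
  leds[pos] = CHSV(160, 180, 255);     // bright blue-ish pulse at the current position
  pos = (pos + 1) % NUM_LEDS;          // advance the pulse and wrap around
  FastLED.show();
  delay(60);                           // sweep speed
}
```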

Jan 18 2020
 

My door has a crazy turret peephole thing. It belongs in an arctic research submarine. It’s huge and it rotates. Pretty sure it was designed by Carl Norden.

The Norden Door Sight

So I wanted to make a little peephole camera for it, because it’s 2020 and I own a 3D printer and play with cheap microcontrollers with built-in cameras, so why the hell haven’t I done this already?

I didn’t have a good answer for that question so I did this:

And I put it here:

And now I can see this:


And FYI this ESP32-CAM is running the esphome firmware and integrated into Home Assistant, which are two of the most awesome things I’ve encountered lately.

I won’t bother posting the 3D files or esphome yaml unless someone asks. The 3D files are just a crappy case I stole off Thingiverse and modified to secure it to the end of a cylinder, and the yaml is straight from the esphome examples. I’ll post some of the yaml I’ve made for servo controls with ESP32-CAM\Lolin32 boards a bit later; I’m still working on making servo movements a little smoother, and I’m having to relearn simple stuff like incrementing values in loops in yaml. Sadly, literally none of my arduino code is remotely portable for any of this.

Jan 11 2020
 


So here’s a little test of a ‘waveguide’ effect for LEDs.

This uses 5/8″ tube filled with clear silicone and I think it worked out pretty well. The video doesn’t capture the effect perfectly, it’s a little blown out, but you can get the idea that the light is being carried around the ring and looks a bit like it’s fluorescing from the liquid.

I started thinking about this a while ago and finally got around to putting it all together. I initially started by trying to do tests with little sections of tubing attached to LEDs. I tried a few kinds of caulk and silicone, but the only thing that really popped was the super clear silicone.

There are a few tricks to the process. You have to fill the tube without any bubbles, which takes a bit of practice, patience, and elbow grease with a caulk gun. Then you have to smash the tube over the LED with enough excess to displace air and not allow any bubbles to form between the LED and the tube, which would block the light and dim the tube a lot. And you have to secure the tube to the LED to allow it to cure.

The simplest way to do all this was to just design and print the connector I had in my head for if the lighting effect worked. I made one section of tube and it looked pretty good, so I made this one with three sections.

The back ‘spine’ is hollow with a WS2812 strip doubled over so there’s an LED pointing left and right into each section of tube. The ‘waveguide’ carries light a few inches, enough for several configurations, but it can’t really replace EL wire or anything.

I didn’t include any STL’s or code because the silicone and acrylic is the trick, everything else is pretty simple to reproduce if you do this kind of thing, and if you don’t you should do that first before you start getting covered in silicone while playing with electronics.

Not sure why the video isn’t embedding but here it is.

https://youtu.be/fN1kAN9xcQ0

Jan 10 2020
 

This is an old ‘hearing test’ box that I gutted and used for various projects. This time it’s filled with older Raspberry Pis. I mainly use Zeros now, but wanted to make a sort of functional retirement home for my Pis. This rig has:
(1) Raspberry Pi B (original 256MB RAM)
~WiFi Bridge to LAN
(3) Raspberry Pi 2B+
~WebCam
~LCD
~RTL-SDR
(1) Acer Netbook running Raspbian x86
~Interface
(1) 5-Port Ethernet Switch

I don’t turn it on much, but it’s a nice little package whenever I need to haul it out for something.
Figured it deserved a post so here’s that.

Aug 13 2019
 

I’m getting back around to some of my electronics stuff and I was discouraged that the Lolin32 I was using had been discontinued. In retrospect I probably should have used ESP8266 as the basis for WiFi LED control, but meh.

So anyway, I recently found a new board to get excited about, plus a few battery modules that haven’t actually come in yet. The ESP32-CAM has most of what I’d want in a board for simple bots: camera, WiFi, and enough GPIO to control a few servos and the onboard LEDs. I got my first one in last weekend and spent it trying to get the basic functions for an FPV bot down.

Took me a while to cobble together the functions I needed from available docs and example code, but I’ve got a sketch that does the following:
~Connect to WiFi (SSID\PW hardcoded for now)
~Start camera stream
~Start UDP listener, parse and process incoming packets
~Servo motor control (2) currently
~LED control, onboard RED (ON\OFF) and WHITE (PWM)

Here’s the code I’ve got so far. Be warned: like all my code, it’s a nightmare, but it works for me. I left some notes and links to other code I scavenged, but there’s a lot more to do, and I honestly don’t even get how the camera stream is working in this sketch, but it does, so good enough for now.
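The actual sketch is too long (and too ugly) to reproduce here, but to give the flavor of just the UDP command side (not the camera part), here’s a stripped-down sketch of the listener/parser piece, with a made-up packet format, placeholder credentials, and none of the servo/LED driving filled in:

```cpp
#include <WiFi.h>
#include <WiFiUdp.h>

// Placeholder credentials and port, not the real ones
const char*    WIFI_SSID = "myssid";
const char*    WIFI_PW   = "mypassword";
const uint16_t UDP_PORT  = 4210;

WiFiUDP udp;
char packetBuf[16];

void setup() {
  Serial.begin(115200);
  WiFi.begin(WIFI_SSID, WIFI_PW);
  while (WiFi.status() != WL_CONNECTED) {   // block until WiFi is up
    delay(250);
  }
  Serial.println(WiFi.localIP());
  udp.begin(UDP_PORT);                      // start listening for command packets
}

void loop() {
  int len = udp.parsePacket();
  if (len > 0) {
    int n = udp.read(packetBuf, sizeof(packetBuf) - 1);
    if (n < 0) n = 0;
    packetBuf[n] = '\0';
    // Hypothetical 3-byte command: [servo1 angle][servo2 angle][flash LED level]
    if (n >= 3) {
      uint8_t servo1 = (uint8_t)packetBuf[0];
      uint8_t servo2 = (uint8_t)packetBuf[1];
      uint8_t flash  = (uint8_t)packetBuf[2];
      // ...drive the servos and the LEDC PWM for the white LED here...
      Serial.printf("servo1=%u servo2=%u flash=%u\n", servo1, servo2, flash);
    }
  }
}
```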

The biggest downside of this board for me is it doesn’t have a built-in USB interface, so I have to use the TX\RX pins to load code, but that’s not too big a deal. Since I only have the one ESP32-CAM at the moment, I’m leaving it on the breadboard for testing. I’ve ordered a few more since they’re so cheap- the next one I get will go on this little robot platform I found online and printed. I actually found this design online some time ago and saved the files, but now I can’t figure out where I found it, so if anyone knows this design and has a link to it please let me know.

Hopefully I’ll post more because hopefully I’ll do more, but this is something, and I figure if anyone else out there is messing with an ESP32-CAM they may find this marginally useful. If by some chance you are working with this board feel free to contact me- it’s got some great potential.

Aug 28 2017
 

IMG_20170827_201722 IMG_20170827_201832

So I’ve been using Arduino Nanos for a long time and my communications were pretty limited to bluetooth and IR. I always wanted to use a WiFi connection, but the expense and effort to integrate WiFi with Arduino was never really worth it. Enter the ESP8266- I actually made a few attempts to use it as a drop-in replacement for the bluetooth modules I’m using, but I never got anything working the way I wanted.

So then I start seeing these ESP32s and they look pretty sweet, but I wasn’t sure I wanted to take the leap when I was so comfortable in my little Nano world.

Then I see the Lolin32 ESP-32 board, with one major thing that set it apart- the battery connection and charging circuit. Game changer for me. I’ve been using USB battery packs forever because they’re pretty self-contained and nothing I could hack together would be as good as just plugging in the USB, but now I’ve got options.

So I got a couple of these Lolin32s and I’ve had about a week to poke at them. I finally got a few of the puzzle pieces turned over, so I think I’m ready to start redoing the old bluetooth LED FX stuff for the ESP-32.

So this post is mostly just to make notes and such so I don’t forget and maybe they’ll help someone else trying to get started with these chips.

Here’s some pics of my ‘first light’: got a little strip of LEDs controlled via a simple web server on the Lolin32.
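If anyone wants a starting point, a minimal version of that first-light setup looks roughly like this, using the ESP32 Arduino core’s WebServer plus Adafruit’s NeoPixel library- the pin, LED count, and credentials are placeholders, and my real sketch has the usual extra mess:

```cpp
#include <WiFi.h>
#include <WebServer.h>
#include <Adafruit_NeoPixel.h>

const char* WIFI_SSID = "myssid";       // placeholder credentials
const char* WIFI_PW   = "mypassword";

#define LED_PIN   13                    // hypothetical data pin
#define NUM_LEDS  30                    // hypothetical strip length

Adafruit_NeoPixel strip(NUM_LEDS, LED_PIN, NEO_GRB + NEO_KHZ800);
WebServer server(80);

void fillAll(uint32_t c) {
  for (int i = 0; i < NUM_LEDS; i++) strip.setPixelColor(i, c);
  strip.show();
}

void setup() {
  strip.begin();
  strip.setBrightness(64);
  fillAll(0);                           // start with the strip off

  WiFi.begin(WIFI_SSID, WIFI_PW);
  while (WiFi.status() != WL_CONNECTED) delay(250);

  // Dead-simple endpoints: hit /on or /off from a browser on the LAN
  server.on("/on",  []() { fillAll(strip.Color(255, 64, 0)); server.send(200, "text/plain", "on");  });
  server.on("/off", []() { fillAll(0);                       server.send(200, "text/plain", "off"); });
  server.begin();
}

void loop() {
  server.handleClient();                // service incoming HTTP requests
}
```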

NOTES \ STUFF I’VE FOUND SO FAR
~FastLED library isn’t there yet. They just updated it with ESP32 support, but I still couldn’t get it working, so I ended up using Adafruit’s NeoPixel library. There are still some weird flickering and color issues, but for the most part the library works okay.
~Pin 0 is weird. When you have anything connected to it the Lolin32 boots into a different mode- GPIO0 is one of the ESP32 strapping pins that selects the boot mode at reset- so idk if you can use it as a normal GPIO.

TO DO \ STUFF I PLAN TO DO LATER
~Going to figure out more about the deep sleep mode on these things
~Not sure if there’s a way to monitor the attached battery voltage, but that would be great
~Still haven’t tried servos.
~Something about capacitive touch buttons built in.

Dec 23 2016
 

So I figured I’d start posting a little bit about what I’m up to arduino\blender-wise, mostly to try to get back in the habit.

CLOUDY VISION
Been working on a loose idea about some kind of robotish platformish kind of thing. I keep calling it robot rugby, but I’m not really sure what that means at this point. I envision some kind of arena with about 4 simple bots trying to move around a puck or ball, mostly by pushing it. The controls will be closed-loop feedback from an overhead camera that can monitor each bot’s position and do some basic path planning.

SIMPLE BOTS
The bots are super-simple, not even bots really, just remote control things because they really don’t have any sensors of their own. I want to utilize cheap, common parts and this is what I have so far for the ‘v1’ bot. The basic capabilities of these bots are:

-motors for mobility
-LEDs to visually ID each bot via color patterns
-control receiver
-battery powered and able to ‘self charge’ when low

The parts list so far is pretty simple:
(1) 9g Servo – continuous rotation
(1) 2.5g Servo – standard 180deg
(1) Arduino Nano
(1) USB Battery Pack
(3) RGB LEDs
(1) IR Receiver

To get started I went with a differential drive but I’ve changed my mind on that. Here’s the differential drive version:
botv2-img_20161120_215020_lr

This is basically another iteration of ugbot and tweedle, but actually way simpler. I’ll probably end up calling them Tweedles though, or 790s, and if you get those references we should hang out. And here’s the breadboard testing rig I made early on to just get all the wiring and code set up before I soldered anything.

botv2-img_20161222_211927_lr

I decided to go with a ‘car drive’ for several reasons. It takes the weight off the servo drive, I think it’s more efficient to have one drive motor instead of two, and it means I only need one 360deg servo; those are slightly more expensive, or I’d have to mod a regular servo, and that’s a pain. I’m still honing the design, but I’ve got a basic transmission and steering system made from little 3d printed parts. Here’s the first test print of the transmission (photos below). It’s super-simple, the gear ratio is 1:1, and it’s strapped on with zip-ties. I was going to glue it, but the zip-ties magically worked, so I think I’m going to plan on using those in the next iteration of the print.
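Code-wise, the nice thing about the car drive is that driving it stays trivial: one continuous-rotation servo for speed and one standard servo for steering. Here’s a minimal sketch of that idea (the pin numbers and speed scaling are placeholders, not the final bot code):

```cpp
#include <Servo.h>

Servo driveServo;           // 9g continuous-rotation servo: sets speed/direction
Servo steerServo;           // 2.5g standard servo: sets steering angle

const int DRIVE_PIN = 9;    // hypothetical pins
const int STEER_PIN = 10;

void setup() {
  driveServo.attach(DRIVE_PIN);
  steerServo.attach(STEER_PIN);
}

// speed: -100..100 (0 = stop), angle: -45..45 degrees off center
void drive(int speed, int angle) {
  // Continuous-rotation servos treat ~1500us as stop; the offset sets speed and direction
  driveServo.writeMicroseconds(1500 + speed * 4);
  // Standard steering servo centered at 90 degrees
  steerServo.write(constrain(90 + angle, 45, 135));
}

void loop() {
  drive(50, 20);    // forward, turning
  delay(1000);
  drive(0, 0);      // stop, wheels centered
  delay(1000);
}
```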

botv2-img_20161218_191906_lr botv2-img_20161218_191924_lr

CONTROL
I decided to use IR control because it’s by far the cheapest as far as parts go- we’re talking less than a dollar, compared to BT, RF, or WiFi, which can run several dollars and add a lot of unnecessary complexity. Though I probably will play with those ESP8266 WiFi modules at some point; I’ve already ordered a couple to mess with since they’re only a couple of bucks.

I don’t really need manual control for the ultimate goal of this, since this is really all an excuse for me to play with computer vision and visual servoing concepts and code, but it makes sense to create a manual controller just for testing.

I wasted a lot of time screwing around with a ‘funduino’ board, but it ate up too many pins and one of the buttons never worked, so I got frustrated with it. At some point I realized I had a Wii nunchuck and a wiichuck adapter already, so I put them to use. The wiichuck is I2C, so it only uses 2 data and 2 power pins- way better than the 8 or so pins the funduino required. I wish I’d thought of that before I wasted like $20 on the funduino and an arduino uno to use with it. Here’s the wiichuck IR control rig:

botv2-img_20161222_211845_lr
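For reference, reading the nunchuck over I2C is only a handful of lines. Here’s a minimal sketch of the read loop, assuming the common unencrypted init sequence (0xF0/0x55 then 0xFB/0x00) at the standard 0x52 address; the decoding is the usual bit-twiddling:

```cpp
#include <Wire.h>

const uint8_t NUNCHUK_ADDR = 0x52;

void setup() {
  Serial.begin(9600);
  Wire.begin();
  // Unencrypted init (avoids the old XOR-0x17 decoding dance)
  Wire.beginTransmission(NUNCHUK_ADDR);
  Wire.write(0xF0); Wire.write(0x55);
  Wire.endTransmission();
  Wire.beginTransmission(NUNCHUK_ADDR);
  Wire.write(0xFB); Wire.write(0x00);
  Wire.endTransmission();
}

void loop() {
  uint8_t data[6] = {0};
  Wire.requestFrom(NUNCHUK_ADDR, (uint8_t)6);
  for (int i = 0; i < 6 && Wire.available(); i++) data[i] = Wire.read();

  uint8_t joyX = data[0];               // 0..255, ~128 centered
  uint8_t joyY = data[1];
  bool zPressed = !(data[5] & 0x01);    // buttons are active-low
  bool cPressed = !(data[5] & 0x02);

  Serial.print(joyX);     Serial.print(' ');
  Serial.print(joyY);     Serial.print(' ');
  Serial.print(zPressed); Serial.print(' ');
  Serial.println(cPressed);

  // Request the next sample
  Wire.beginTransmission(NUNCHUK_ADDR);
  Wire.write(0x00);
  Wire.endTransmission();
  delay(20);
}
```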

 

IR CODE
I had no idea what I was getting into. I’d played with RF and BT modules before, so I thought IR control would be simple, and it kind of is, but it’s also kind of a nightmare.

A little backstory on the IR thing- I’ve also been kind of wanting to control flying things via computer vision for a long time- specifically these little cheap IR controlled helicopters. Quads are fun too, but the tiny ones are really hard to control, have terrible battery life, and are more expensive than I’d like. The little IR helicopters are surprisingly easy to fly, cheap, and have decent battery life for a flying thing. Going with IR for the bots was a dual purpose decision, it was cheap for the bots, but it also could lead into controlling the helicopters too.

I started to look into LIRC, but honestly I just got confused; I don’t think it was meant for what I’m trying to do. I slapped an IR LED onto a Raspi and messed around with it awhile, got it to send a few codes, but I just didn’t like it, so I went back to the arduino to actually fire the IR LED. I figured I’d set it up so the PC sends serial commands to the arduino and it fires those off in IR language. I didn’t realize IR language was such a pain in the neck to learn and speak.

I won’t go through the whole process; a lot of it involves me being an idiot, making stupid assumptions, and spending long hours learning how stupid those assumptions were and why they were stupid. This is one of the tools I made to help me stop being stupid. It’s just an IR receiver and an OLED screen I thought I could use to analyze IR codes. It really hasn’t helped that much, but I do like these little OLED screens a lot now. Here’s that thing:

botv2-img_20161222_211941_lr

Controlling the bots turned out to be fairly straightforward. I just used the NEC protocol to send 6-byte chunks that encode each bot’s motor states and ‘ID’ (because I expect to be controlling multiple bots with one IR LED), plus another byte in case I need it, maybe for RGB LED states or something. It actually wasn’t as straightforward as all that, because NEC has these repeat codes that are just weird; I didn’t know they were happening at first and thought the receiver was junk or something.
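To show the packing idea (this isn’t my actual packet format, and I’m using the plain IRremote library’s sendNEC here instead of IRLib2 just because it’s the shortest way to illustrate it): a bot ID, two motion bytes, and a spare byte crammed into one 32-bit NEC frame.

```cpp
#include <IRremote.h>   // classic IRremote 2.x API

IRsend irsend;          // IR LED on the library's default send pin (pin 3 on an Uno/Nano)

// Pack a bot ID, two motion bytes, and a spare byte into a single 32-bit NEC payload
unsigned long packCommand(uint8_t botId, uint8_t driveA, uint8_t driveB, uint8_t spare) {
  return ((unsigned long)botId  << 24) |
         ((unsigned long)driveA << 16) |
         ((unsigned long)driveB << 8)  |
         spare;
}

void setup() {}

void loop() {
  // Bot #2: first motion byte full, second at half, spare unused
  irsend.sendNEC(packCommand(2, 255, 128, 0), 32);
  delay(100);   // small gap between frames so the receiver can keep up
}
```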

I thought controlling the helicopters would be a matter of pointing the remote at my receiver, reading the codes, and figuring out the protocol. That didn’t work. The arduino IR library I was using (IRLib2) had no idea what those codes were. Long story short, I finally broke out my little oscilloscope and made a cheap-o IR probe from an IR LED and a broken headset jack. I’m still figuring out the exact protocol these things use, but at least now I can see the actual pulses and decode them myself. So far I can get the helicopter to turn its light on and off, but even that doesn’t always work. Here’s my trusty Gabotronics Xprotolab Portable with the probe and the helicopter controller- pretty much the best Kickstarter thing I’ve backed. This thing is great, and it was like $80 through the campaign.
botv2-img_20161222_212053_lr

WHY NO CODE YET?
Because at the moment it’s not worth posting and I am surprisingly lazy. I’ll post print files and code when there’s anything really worth posting, but if by some bizarre chance you’re working on something similar and give no flips about the state of the code I’d be happy to share.

LATER
I have yet to actually control a robot via computer vision, but that’s part of the long-term goal of this. I have played with a little of the code required just to see what I’m getting into. I downloaded ViSP (Visual Servoing Platform) and went through a few tutorials. I realized pretty quickly it wasn’t what I needed, or it was a lot more than I needed. It also meant I’d be coding in C++, and I was okay with that but it wasn’t my preference. I realized most of the functions I needed were actually in OpenCV, so I just went to that, and I can use python, which makes my life much easier.

Since I know I’ve got a lot of work to do on the bots, I thought I’d make an even simpler platform to test object tracking. So I put together this little rig that has 2 servos with a laser diode slapped on it, so I can just point it and a camera at a wall and start tracking the movements of the laser and controlling it with visual feedback. Here’s that little rig:

botv2-img_20161222_215613_lr

This is really nothing so far; I’ve just been randomly poking at this idea for a few months and decided I should post something to try to get back in the habit of posting stuff I work on. I would be nowhere with my arduino stuff if it weren’t for other people’s little blogs and sites, so I figure I’m obligated to offer the same to some other random shmuck trying to fudge their way through an idea that’s really way over their head but diving in anyway. Here’s to you, shmuck.

WAIT, WHAT?
Yeah- I know. I’m not really sure either.

I’ll post more on this project as it evolves, but it is just kind of a cloudy vision of an idea, so don’t hold your breath for anything spectacular. I think the old progranimals and lightning stuff was actually cooler and I should probably be developing that stuff more, but this is what I feel like doing now, so I am.

Jun 21 2015
 

It’s Father’s Day and mine recently created something pretty darn poignant regarding the treatment of the U.S. Flag. You should watch it.

There is a lot of lip service about patriotism these days, and very little genuine understanding about America as a people, a nation, a government, or even just a freaking geographic area for that matter. Fortunately we still have a solid foundation that we could pick ourselves back up from and maybe one day- learn to stand tall and be proud of ourselves as Americans for the right reasons. I know we still have a chance because I can still love and publicly criticize America right here within it.

We’ve got a lot of work to do to make this place anything near worthy of how well we market it to ourselves, but in the meantime, we can at least show respect to those who helped build America’s legitimate foundation by doing what my father suggests in this video, just doing the darn housekeeping for crying out loud.

Happy Father’s Day, Dad, and Pop and DeeDee- I know they appreciate this video a lot. I think this video is kind of a fitting Father’s Day gift for them too. I know this little diatribe doesn’t quite fit with the simple purity of your message in the video, but I don’t think you’ll mind me stepping up on my soapbox for a minute; you did kind of help teach me about the whole free speech thing and using my brain and whatnot, can’t unring that bell- lol.

God Bless America, and also teach us to share ALL of our blessings with the whole world, like our fathers shared their blessings with us.

Jun 14 2015
 

So here’s something I should have posted quite a while ago. This is a ‘breakout board’ I made for controlling LED strips with Arduino over bluetooth or USB.

PCB

It’s a little PCB for an Arduino nano, a bluetooth chip, and (3) 4-pin terminal blocks with PWR-GND-SN-CK, plus a 2-pin block for PWR-GND input. It can be used with 3- or 4-pin LED strips, or anything really- servos, sensors, whatever. I designed the PCB using Fritzing and had it printed through OSH Park. If you’re interested in one let me know via the funkboxing contact form. I’ll offer just the bare board or a pre-soldered and tested version with a nano, bluetooth chip, and terminal blocks, or any variation thereof. I can’t really come up with a price that makes sense, so if you tell me what you’re planning on using it for and I think it’s cool I’ll probably give it to you pretty much at-cost.

And here’s an LED array I made with it.

Array

It uses (64) WS2811 LEDs in an 8×8 array, an Arduino Nano, a bluetooth chip, and a USB battery pack. It’s a double-decker sandwich of (3) 1/2″ thick acrylic plates, so the whole rig is excessively large and heavy, but I kind of meant for it to be that way. I wanted something as a little testbed for 2D effects, of which I’ve only made a few so far. I’ll post the 2D array code at some point, but I should clean it up and make sure it works with the newest FastSPI library first.
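In the meantime, the one piece of the 2D code actually worth writing down is the XY-to-index mapping; everything else is just normal strip effects. Here’s a minimal sketch of that helper using the modern FastLED API (the successor to FastSPI), assuming serpentine wiring- my actual panel layout may differ, so treat the details as placeholders:

```cpp
#include <FastLED.h>

#define DATA_PIN 6                 // hypothetical data pin
const uint8_t WIDTH  = 8;
const uint8_t HEIGHT = 8;
CRGB leds[WIDTH * HEIGHT];

// Map (x, y) on the panel to a strip index, assuming serpentine wiring:
// even rows run left-to-right, odd rows run right-to-left.
uint16_t XY(uint8_t x, uint8_t y) {
  if (y & 0x01) return (uint16_t)y * WIDTH + (WIDTH - 1 - x);
  return (uint16_t)y * WIDTH + x;
}

void setup() {
  FastLED.addLeds<WS2811, DATA_PIN, GRB>(leds, WIDTH * HEIGHT);
  FastLED.setBrightness(64);
}

void loop() {
  // Simple 2D test pattern: a red diagonal that scrolls down the panel
  static uint8_t offset = 0;
  FastLED.clear();
  for (uint8_t x = 0; x < WIDTH; x++) {
    leds[XY(x, (x + offset) % HEIGHT)] = CRGB::Red;
  }
  offset = (offset + 1) % HEIGHT;
  FastLED.show();
  delay(120);
}
```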