Learn About Me
My name is William Chapin, but everyone calls me Liam.
I’m from Fredericksburg, Virginia and attend Virginia Tech as a graduate student in Computer Science.
Professionally, I am a robotics software engineer. I build software which allows robots to move safely, manipulate objects in their environments, and communicate with one another in a collaborative way. This software also enables users to analyze the capabilities of robots that exist only in simulation, providing useful insight into what would or wouldn’t work in constructing a prototype robot.
I believe robotics lies at the perfect intersection of computer science, mechanical engineering, and electrical engineering. Consequently, I have driven myself to be capable in the two disciplines that are not my own. I can and will use AutoCAD to design parts, use 3D printers and laser cutters to manufacture them, and then design the circuits and power systems that bring them to life. Flexibility is key to low-TRL (technology readiness level) research.
Personally, I am a tinkerer. My desire to learn and my passion for discovery are insatiable. It is not unusual for me to pick up a hobby simply so I can learn enough about it to consider myself accomplished in the field. I fix vintage arcade machines with my fiancé. I enjoy working on my truck, going off-roading, and camping in the wilderness. I’ve learned how to build server systems and host my own applications. I build drones so I can race FPV, and attractive custom mechanical keyboards so I can feel fancy whilst coding the software which powers it all. There’s nothing which can’t be hacked, taken apart, investigated, and improved upon. It’s relaxing and fulfilling, and that’s what I desire from my workplace also.
What I do
What good is robotics without experimental technologies and copious research? I have multiple publications in the field with more on the way, particularly in space robotics, autonomy, and optimization.
Joanie is a Seeburg KD200 jukebox* which my Dad gave to my Mom as his wedding present. Throughout my whole childhood, it was positioned near the kitchen, where it would play whilst we cooked or ate.
Over the years, Joanie wore down from constant use, requiring visits from various jukebox repairmen. Watching them, I learned the basics of how to work on the machine, though they all hated the job due to Joanie's unique* nature.
Finally, in 2018, Joanie shut down for good, no longer able to make record selections or push music through its amplifier and speakers. It sat in my Mom's kitchen until 2022, when she gave it to me so that I could begin the process of repairing it.
To fill Joanie's place in her life, her fiancé gave her a Seeburg 100G named "Genie," which now sits where Joanie once sat.
As of now, Joanie is working and playing again in my and my fiancé's home.
*Joanie actually only looks like a KD200 on the outside. One of the things that made Joanie such a particularly difficult jukebox to work on is that the internals (selector and amplifier) come from a Seeburg 100Q, and Joanie's wiring was modified to fit these new additions. Why these elements were used can only be guessed at. My guess is that whoever did the conversion wanted to move away from the failure-prone circuit-board-based units in a stock KD200. Oh, the irony.
Basic Robotics is a Python package I've developed over my time at Virginia Tech, shaped by needs and wants encountered in various projects I worked on or supported indirectly. Almost all of my research in robotics has been incorporated into this package or used it in some way.
Some key features include
Basic-Robotics (and the optional Basic-Robotics-Workspace package) can be found on my GitHub or downloaded directly from pip, with quickstart examples built in.
In the midst of the pandemic, my old Toyota RAV4 began to die, so I started looking for a replacement. Living in the mountains of Southwest Virginia, I wanted an offroad-capable vehicle, specifically one with a manual transmission. Jeeps didn't interest me due to their reliability issues, so the hunt for a Toyota was on.
I found this FJ at a dealership in New Jersey. It had previously lived its life in Florida, so frame rust was minimal, and it had a sunroof put in, an interesting mod given that sunroofs came stock on exactly zero FJs.
I named the FJ 'Fred Johnson' as an alternate backronym of the FJ name, and also because I'm a big fan of The Expanse. In the grille of the FJ, waterjet-cut OPA logos can be seen, and the rear window has a large 'OPAS FRED JOHNSON' sticker.
Since getting Fred, it's been on numerous adventures, and I continue modifying it and hacking it to turn it into some kind of offroad batmobile.
CIRAS, or Commercial Infrastructure for Robotic Assembly and Servicing, was a project I worked on at NASA Langley during my longest internship there, in 2017/2018.
The objective of CIRAS was to design a system wherein a team of robots could assemble a truss structure of theoretically infinite length from base components. For the purposes of our demonstration, this length was set to two one-meter-square bay truss sections.
The overall system consisted of two robotic subsystems comprising four independently controlled sub-robots. The LSMS, or Lightweight Surface Manipulation System, served as the LRM (long reach manipulator). Attached to the end effector of LSMS was SAMURAI, or Strut Assembly, Manufacturing, Utility & Robotic Aid, which handled the struts themselves. The second system consisted of NINJAR (NASA Intelligent Jigging and Assembly Robot), a Stewart-platform-type robot which served as the truss jig. NINJAR sat atop a separately controlled turntable, which allowed LSMS to access any face of NINJAR.
My role in the CIRAS demonstration was to design the software which allowed the robots to work together. I wrote control software for each robot and the central control software which coordinated them, rehearsed the operations by running them in 'joystick mode,' and oversaw a successful demonstration of truss assembly by the robotic team.
Stewart platforms (SPs) possess immense strength and stability compared to serial arms, at the cost of a very limited workspace. Stacking Stewart platforms overcomes this shortfall, yielding a robot with the strength inherited from Stewart platforms and the workspace potential of a serial mechanism. The drawback to this paradigm is that the resulting robot is vastly overactuated, and control is therefore complicated. However, this overactuation also provides redundancy in the system, which is a major advantage for extraplanetary exploration.
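Part of what makes Stewart platforms so attractive is that their inverse kinematics are closed-form: each leg length is simply the distance between its base anchor and its platform anchor once the desired pose is applied. Here's a minimal NumPy sketch with made-up anchor geometry (the coordinates and pose are illustrative only, not any actual hardware):

```python
import numpy as np

def rot_zyx(roll, pitch, yaw):
    """Rotation matrix from Z-Y-X (yaw-pitch-roll) Euler angles."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    return Rz @ Ry @ Rx

def leg_lengths(base_pts, plat_pts, pos, rpy):
    """Inverse kinematics: actuator length for each of the six legs.

    base_pts, plat_pts: (6, 3) anchor coordinates in base / platform frames.
    pos: desired platform position; rpy: desired (roll, pitch, yaw).
    """
    R = rot_zyx(*rpy)
    world_plat = (R @ plat_pts.T).T + pos   # platform anchors in base frame
    return np.linalg.norm(world_plat - base_pts, axis=1)

# Illustrative geometry: anchors on circles of radius 1.0 (base) and 0.5
# (platform), with the platform anchors offset 30 degrees from the base's.
ang_b = np.deg2rad([0, 60, 120, 180, 240, 300])
base = np.stack([np.cos(ang_b), np.sin(ang_b), np.zeros(6)], axis=1)
ang_p = ang_b + np.deg2rad(30)
plat = 0.5 * np.stack([np.cos(ang_p), np.sin(ang_p), np.zeros(6)], axis=1)

# At a level, centered pose, symmetry makes all six legs the same length.
lengths = leg_lengths(base, plat, pos=np.array([0, 0, 0.8]), rpy=(0, 0, 0))
```

Stacking platforms just chains this: the pose of each stage's platform frame becomes the base frame of the next, which is also why the overactuation (and the control problem) compounds with every stage.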
The first stacked Stewart platform robot I'm aware of is the Logobex LX4, but my introduction to the concept came with ASSEMBLERS. ASSEMBLERS... the acronym has been lost to time as the program took on an identity of its own. It describes a system of lower-cost, modular robots which can collaborate to, well, assemble. Neat.
ASSEMBLERS was conceptualized and started by Dr. Erik Komendera as an ECI program at NASA Langley Research Center, and was later continued by Jim Neilan when Dr. Komendera left for Virginia Tech. The primary demonstration objective was initially to have a group of these robots work together to construct a habitat for use on the Moon. It was also conceptualized that they could link together to form robots with other kinematic topologies, such as a hexapod walker. I wrote software to model this.
LSMS, or the Lightweight Surface Manipulation System, is an initiative developed by NASA Langley Research Center to create a tendon-actuated long reach manipulator for use on the lunar or Martian surface. The technology that enabled LSMS is closely related to TALISMAN, also developed at NASA Langley.
When I joined NASA Langley to work on the CIRAS project, LSMS was to serve as the project's long reach manipulator. My role was to enable this. I worked on the original LSMS software to create new capabilities for ease of control, and enabled local control using joint encoders on the system itself, so it wasn't as dependent on the lab metrology system.
In a later internship, I returned to my LSMS work, and created a more modern, modular variant of the LSMS software which was also compatible with the Robot Operating System (ROS).
During my time at FASER Lab, we've created a modified version of the LSMS, led by FASER students Jacob Martin and Dominic Bisio. My role, again, was software: creating the control software for this new arm so that it could be hand-driven or driven autonomously.
That is me on the left, controlling the arm.
FORCE is my go-to for testing any new robotic capabilities, especially in Basic-Robotics. Shown is FORCE navigating a complex environment using collision detection and RRT (rapidly-exploring random trees).
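The core of RRT is short enough to sketch: repeatedly sample a random point, extend the nearest tree node a fixed step toward it, and keep the new node only if the connecting segment is collision-free. Here's a minimal 2D version with circular obstacles; it's illustrative only, not the Basic-Robotics implementation, and the world geometry is made up:

```python
import random
import math

def collides(p, q, obstacles, steps=20):
    """Check the segment p->q against circular obstacles by sampling."""
    for i in range(steps + 1):
        t = i / steps
        x = p[0] + t * (q[0] - p[0])
        y = p[1] + t * (q[1] - p[1])
        for (cx, cy, r) in obstacles:
            if (x - cx) ** 2 + (y - cy) ** 2 <= r ** 2:
                return True
    return False

def rrt(start, goal, obstacles, bounds=(0.0, 10.0), step=0.5,
        goal_tol=0.6, max_iters=20000, seed=0):
    """Grow a tree from start toward random samples; return a path or None."""
    rng = random.Random(seed)
    nodes = [start]
    parent = {0: None}
    for _ in range(max_iters):
        sample = (rng.uniform(*bounds), rng.uniform(*bounds))
        # Nearest existing node to the sample.
        i = min(range(len(nodes)), key=lambda k: math.dist(nodes[k], sample))
        near = nodes[i]
        d = math.dist(near, sample)
        if d == 0:
            continue
        # Steer a fixed step from the nearest node toward the sample.
        new = (near[0] + step * (sample[0] - near[0]) / d,
               near[1] + step * (sample[1] - near[1]) / d)
        if collides(near, new, obstacles):
            continue
        parent[len(nodes)] = i
        nodes.append(new)
        if math.dist(new, goal) < goal_tol and not collides(new, goal, obstacles):
            # Walk parents back to the root to recover the path.
            path, k = [goal], len(nodes) - 1
            while k is not None:
                path.append(nodes[k])
                k = parent[k]
            return path[::-1]
    return None

obstacles = [(5.0, 5.0, 1.5)]   # one circular obstacle mid-field
path = rrt((1.0, 1.0), (9.0, 9.0), obstacles)
```

The same loop generalizes to a robot's joint space: "point" becomes a joint configuration and the segment check becomes a full collision query against the environment mesh.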
Self-hosting and open source software have long fascinated me.
I got my start in high school with a modifiable open source Discord bot named Red. I quickly programmed Red to function as my personal assistant and monitor for an old laptop of mine which I dubbed 'the server.' Creative, I know. My instance of Red was renamed 'Speedwagon' after the character from JoJo's Bizarre Adventure, which started a tradition of all my servers having JoJo-related hostnames. Speedwagon's primary function was maintaining game servers for my friends and me. Messaging it could result in game servers being launched or shut down, and Speedwagon would watch resources and shut things down for inactivity.
As time went on, my needs and capabilities evolved. I discovered I could get surplus enterprise-grade hardware cheap second hand, created a 10 Gb fiber network for myself, and expanded to a full server rack. Some of the services I host there include GitHub, DokuWiki, Home Assistant, Portainer/Docker, Speedwagon (of course), an inventory management system of my own design, various game servers for my friends, a reverse proxy, a photo management solution, etc.
My current configuration consists of the following servers:
Oh how I would love to install solar panels.
At the start of the pandemic, we were all banished to our homes, to ride out the quarantine. I, being a maker, had a number of 3D printers in my small apartment, and I knew my lab also had a number of them which were no longer being used.
In an effort to do something positive, I asked to borrow the printers in order to manufacture PPE, such as face shields, for the hospitals in the area.
The effort ultimately morphed from simply printing Prusa-designed shields on my lab's printers out of my apartment into something much larger. A university effort was formed to produce shields, with help requested from the community. I was at the forefront of the effort, coordinating shield donations, manufacturing the laser-cut components, and designing new shield variations. At the peak of production, there were 13 printers in my small 600 sq ft apartment, most of them owned by the university, all producing shields 24 hours per day.
As the project went on, I began to design new variants of the shields in keeping with design requests from local hospitals, and worked with Virginia Tech's TREC Lab on design, manufacturing, and research. The effort included many people at Virginia Tech, with several departments working in tandem.
At the close of the project, we, the greater Blacksburg community, had manufactured several thousand face shields, which were distributed to various institutions in the local area.
Design never stopped as the project continued. Toward the end, I created a version of the shield which could be flipped up and sized to fit an individual user. Unfortunately, it never entered mass production by the community, as demand for shields fell.
LSMS Gantry is a system wherein the LSMS would be used as a type of 3D printer, large enough to print whole habitats on an extraterrestrial surface.
I conceptualized the project as an arc polar coordinate 3D printer. The primary problem with arc polar printers is that the further away they get from their axis of rotation, the less accurate their prints get. At the scale of the LSMS, this had the potential to be a major problem.
My solution to this problem was to undersling a Stewart platform beneath the LSMS's forearm, on a sliding gantry. This Stewart platform would have the capability to compensate for the LSMS's loss of accuracy at greater distances from its axis of rotation, and also to perform dexterous manipulation tasks, serving as an end effector.
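The accuracy loss is easy to quantify: a fixed angular step Δθ at the base joint produces a tangential move of r·Δθ at radius r, so positioning resolution degrades linearly with distance from the axis. A quick sketch with hypothetical numbers (the angular resolution and radii below are assumptions for illustration, not actual LSMS specs):

```python
import math

def tangential_resolution(radius_m, step_deg):
    """Smallest tangential move (m) the base rotation can make at a radius."""
    return radius_m * math.radians(step_deg)

step_deg = 0.05   # assumed angular resolution of the base joint (hypothetical)
near = tangential_resolution(1.0, step_deg)   # 1 m from the axis
far = tangential_resolution(8.0, step_deg)    # 8 m from the axis

# The far-field resolution is 8x coarser than the near-field one; the
# Stewart platform on the gantry must absorb up to half an angular step
# of tangential error to keep the toolpath smooth at full reach.
correction = far / 2
```

This is why the compensating platform only needs a small workspace: it never has to correct more than the base joint's local step size, however coarse that step becomes at distance.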
During the 2020-2021 school year, I had a group of senior design students build a physical prototype of the gantry system, and I've also published a workshop paper on the concept.
The presence of the gantry increases the overall range of motion of the LSMS and decreases the amount of motion required from the LSMS's main motors when using the gantry to print. Here's a video of a simulated LSMS printing a habitat.
TRON is a game by Bally/Midway which was released in 1982. Raiders of the Lost Ark (Atari) is widely credited as the first game based on official movie licensing; however, TRON was the first game developed in tandem with a movie, and it released before Raiders. TRON is also notable for its multiple minigames and extensive theming. I had played the original arcade game several times, mostly at MAGFEST, and it was near the top of my and Sam's acquisition list.
When we got it home, we discovered that the monitor tube was destroyed, and the marquee as well, but the rest of the components were intact, albeit slightly to extensively water damaged, depending on the part. With some tweaking, we were able to get the power supply to start up and the lights on the day we got it home.
More complicated proved to be the Bally MCR board stack (which we had to send out to an expert) and the monitor chassis. We replaced the tube in the hope it would bring the monitor to life, but suffered a persistent vertical collapse. Ultimately, it turned out to be the yoke itself, not the chassis, which was an embarrassing discovery.
We've stabilized the wood, replaced the failed components and bad artwork, and now TRON is very much playable and one of the centerpieces of our home arcade.
When I was in high school, I became interested in drones. The first drone I built was a Lynxmotion Hunter V-Tail 500. I figured, go big or go home. I wanted a challenge for my first build. A challenge I did get. I can't say how many times I crashed it while learning to fly it in acro mode, but each time I did, I rebuilt it. I'd like to say 'better than before' but that wasn't always the case.
From the V-Tail, I branched out to racing quadcopters, especially FPV drones, and joined the Virginia Tech drone racing team. I particularly enjoyed freestyle flight, performing various tricks with the more high-powered of my quadcopters, such as these. I did eventually get good enough that I could pull those off with my trusty old V-Tail without breaking it, or even with micro-sized quadcopters inside a home.
Going back to 'big,' I also built a 7' wingspan RC plane for autonomous operations (shared with some of my drones), and some larger quadcopters which I used in hackathons, such as the one pictured below, which I configured as a flying intermittent mesh network with a 'carrier pigeon' messaging system. This earned me a top prize at William and Mary's Cypherhacks hackathon.
Now, nearly 10 years on from the start of this journey, I'm proud to report that the V-Tail still works, and is better than ever.
Another of my research platforms with origins in high school and early community college: I got this Trossen Phantom hexapod prototype off of eBay directly from Trossen. It came with the original ArbotiX microcontroller with an XBee remote interface, and it ran only their existing gait software.
The first thing I did with it was change the gait software to run a port of Zenta's Phoenix code. This dramatically improved the gaits available to me.
Next, I tried my hand at creating custom PC-based control software in Python, building a version of the Trossen commander application which I could run with an Xbox controller. This interface is what I used in all my later projects with the robot.
I've used the hexapod as a development platform in several hackathons, but my best achievement with it was turning it into a search/recovery robot using OpenCV, onboard lighting, and various ultrasonic sensors. The objective was for it to navigate an environment, exploring exhaustively, until it located a human face, at which point it would report back through the control interface. This won me the Wolfram ingenuity prize at Hack Duke.
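The exhaustive-exploration behavior can be sketched as a depth-first walk over free space that stops as soon as the detector fires. In this toy sketch the OpenCV face detector is stubbed out as a callback and the environment is a made-up grid (both are stand-ins for the real camera feed and ultrasonic map):

```python
def explore(grid, start, sees_target):
    """Exhaustive DFS over free cells; stop when sees_target(cell) is True.

    grid: dict mapping cell -> True if traversable. sees_target: detector
    callback (on the real robot, an OpenCV face detector on the camera feed).
    Returns (found_cell_or_None, visited_order).
    """
    stack, visited, order = [start], set(), []
    while stack:
        cell = stack.pop()
        if cell in visited or not grid.get(cell, False):
            continue
        visited.add(cell)
        order.append(cell)
        if sees_target(cell):
            return cell, order           # report back through the interface
        x, y = cell
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            stack.append(nxt)
    return None, order                   # searched everything, no target

# Toy 3x3 world with one blocked cell and a "face" visible from (2, 2).
free = {(x, y): True for x in range(3) for y in range(3)}
free[(1, 1)] = False
found, order = explore(free, (0, 0), lambda c: c == (2, 2))
```

The stack-based frontier guarantees every reachable cell is eventually visited, which is exactly the "explore exhaustively until a face is found" contract described above.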