This is my cat.
His name is Tuco.
He loves fetching these nylon bolts.
And this is Tuco Flyer,
a robotic camera that helps you hang out with Tuco on the internet.
The camera is on a gimbal that moves it around smoothly,
and it's connected to a computer running a suite of vision algorithms.
And we built some winches to pull this whole camera platform around the room.
My goal here is a combination of automatic movement and interactive control,
so the camera should know how to avoid obstacles and move around the room safely,
but you should be able to control it over platforms like Twitch and Let's Robot.
The robot's attached to the ground via a tether that provides power and sends back low-latency,
uncompressed video to the computer vision and streaming hardware on the ground.
This is important because Tuco's the co-star in my online streaming show which eventually
turns into these edited videos.
You're watching scanlime, and this is my own combination of art, electronics, reverse engineering,
and forward engineering.
Previous episodes went into more detail on the mechanical and software modifications
that make this gimbal configuration possible,
and the winchbot is a DIY take on a theatrical winch, with force
and velocity feedback and a wall-mount design.
Four of these winches together give the bot freedom to move within my rectangular space.
The winches, flyer, and a control PC communicate through a dedicated network switch.
At this point, the project's hardware is mostly built, but there's a long tail of software
to write and bugs to fix and algorithms to design and tune and maybe redesign.
This is honestly a pretty nerve-wracking part of the project, where the many subsystems
come together and all the unsolved problems and unknown complications start appearing.
We already had some trouble with the gear motors, and there are several things I'd do
differently now if I were designing the winches from scratch.
But after that one motor replacement, all the winches have been holding up.
Overall stability has been another big question in the current design.
The long half-meter carbon fiber boom in this design is part of a compromise.
For the best stability, we want the camera and the winch attachments as close as possible
to the center of mass,
but that would limit how low we could fly the camera without cables bumping into furniture.
This design keeps collision avoidance sensors on the very top of the robot, near where the
cables attach.
Then the camera drops down half a meter, in front of these downward-facing sensors.
This design lets the camera fly lower in the room than it otherwise could without a human
pilot guiding it.
Unfortunately, it puts the center of mass somewhat below the wire attachment point,
so the robot will tend to act like a pendulum.
Right now I'm betting that the extra maneuverability will be worthwhile, and I'm also hoping I
can use the motion control software to actively counteract these oscillations.
For now though, it just means the bot takes some time to stop wobbling any time it moves.
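For a rough sense of scale, a simple pendulum approximation gives the wobble a period somewhere around a second and a half. This is only a sketch: it assumes the center of mass hangs the full half-meter boom length below the attachment point, which won't be exactly true of the real hardware.

```python
import math

def pendulum_period(length_m: float, g: float = 9.81) -> float:
    """Small-angle period of a simple pendulum: T = 2*pi*sqrt(L/g)."""
    return 2 * math.pi * math.sqrt(length_m / g)

# Assuming the center of mass hangs roughly half a meter below the
# attachment point, the wobble period comes out to about 1.4 seconds.
print(round(pendulum_period(0.5), 2))
```

A period that slow is at least within reach of a software control loop, which is part of why active damping seems plausible.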
If it turns out I can't make the motion smooth enough just by improving the software, the
hardware design is modular and I could add a counterweight on top or move the attachment
point downward.
The software will aim the gimbal using either manual control input or a region of interest
from the computer vision algorithms.
This is the second gimbal tracking algorithm.
At first, it would always keep the region of interest centered in the screen using a
proportional control loop.
This was very sensitive to noise in the computer vision output, and the photographic framing
was kind of boring.
Now I'm using separate PID control loops for each axis, triggered when the region of interest
crosses into gain regions at the edges of the screen.
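A minimal sketch of what one axis of that scheme could look like. The gains, the gain-region threshold, and the class structure here are all made up for illustration; the real controller's parameters and shape may differ.

```python
class AxisTracker:
    """One axis of gimbal tracking: a PID loop that only engages once
    the region of interest crosses into a 'gain region' near the
    screen edge. All gains and thresholds here are hypothetical."""

    def __init__(self, kp=0.8, ki=0.05, kd=0.2, edge=0.35):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.edge = edge          # gain region starts this far from center (0..0.5)
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, roi_center: float, dt: float) -> float:
        """roi_center runs -0.5..0.5 across the frame; returns a gimbal rate command."""
        error = roi_center
        if abs(error) < self.edge:
            # Inside the dead zone: hold still and let the framing breathe.
            self.integral = 0.0
            self.prev_error = error
            return 0.0
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```

The dead zone in the middle is what makes the framing less boring than strict centering: the camera only moves once the subject actually approaches an edge.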
As for the winch control, my software is currently very basic and this is where some of the problems
come from.
The plan is for the bot to move automatically in some modes, but before that can happen
the flyer needs to stabilize its height above the ground, and sense nearby obstacles.
The flyer is covered with sensors: an orientation sensor, accelerometer, IR proximity sensors,
and laser rangefinders.
These are all currently being ignored, but in the near future the robot will use them
to know if an object is nearby, and to estimate orientation and height.
It isn't a goal of this project to know exactly where the robot is located in the room, but
we can get a rough estimate by looking at the distribution of weight among the winches.
This estimate will be important for determining how much to move each cord in order to move
the flyer around in 3D space.
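The geometry behind that mapping is straightforward to sketch: each cord's length is just the distance from its winch anchor to the flyer, so a desired 3D move translates into a length change per winch. The anchor coordinates below are hypothetical, and unlike my real pulley points they're laid out symmetrically.

```python
import math

def cord_lengths(anchors, flyer_pos):
    """For a cable robot, each cord length is the Euclidean distance
    from its winch anchor to the flyer's attachment point. Moving the
    flyer means commanding each winch to the new length."""
    return [math.dist(a, flyer_pos) for a in anchors]

# Hypothetical anchor layout: four ceiling corners of a 4 x 3 x 2.5 m room.
anchors = [(0, 0, 2.5), (4, 0, 2.5), (0, 3, 2.5), (4, 3, 2.5)]

before = cord_lengths(anchors, (2.0, 1.5, 1.0))
after = cord_lengths(anchors, (2.0, 1.5, 1.2))   # move the flyer up 0.2 m
deltas = [b - a for a, b in zip(before, after)]  # every cord gets shorter
```

The catch, of course, is that this mapping needs the flyer's current position, which is exactly the estimate the software doesn't have yet.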
Right now the software uses an extremely simplistic estimate, assuming that the room is a perfect
cube and the flyer is always in the center.
This isn't so bad in the center of the room, but my pulley anchor points are extremely
asymmetric so it's not great either.
As the flyer gets further toward the edges of the room, this naive estimate gets worse
and worse, which causes one of the winches to repeatedly bump up against its lower tension
limit.
The tension limits activate a correction procedure which slowly takes up the slack, but if this
motion is too slow it won't keep up, and if it's too fast it will cause an oscillation
by passing the flyer's weight too quickly from winch to winch.
This is one of several oscillations I'm seeing now, where multiple parts of the system
oscillate against each other, and I'll need to do more tuning on those systems or add
some damping to address each specific problem.
In this specific case of keeping the tension balanced, I'm sure I could improve the behavior
by tuning parameters, but ultimately the algorithms need improvement also.
The rough position and orientation estimate is necessary in order to know how much cord
motion corresponds to the desired 3D motion.
I also want to somehow anticipate the tension limits by looking at the rate of tension change,
so the correction I apply can happen gradually and not all at the last possible moment.
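One hedged way to do that anticipation: linearly extrapolate the tension forward and ramp the correction in as the prediction approaches the limit, rather than switching it on at the limit itself. The horizon, the ramp width, and the 0-to-1 gain shape below are all assumptions.

```python
def predicted_tension(tension, rate, horizon):
    """Linear extrapolation: where the tension will be `horizon`
    seconds from now if it keeps changing at `rate`."""
    return tension + rate * horizon

def correction_gain(tension, rate, lower_limit, horizon=0.5):
    """Start correcting gradually as the *predicted* tension nears
    the limit. Returns 0..1, scaling whatever correction the
    controller applies. The constants here are hypothetical."""
    margin = predicted_tension(tension, rate, horizon) - lower_limit
    if margin <= 0:
        return 1.0                    # predicted violation: full correction
    ramp = 2.0                        # margin (force units) over which to fade in
    return max(0.0, 1.0 - margin / ramp)
```

The effect is that a rapidly falling tension triggers a gentle early correction, instead of a sudden one at the last possible moment.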
I want to keep this video from getting too long, but I'll be releasing separate videos
on robot components like the leash and the software stack, so be sure to subscribe if
you're interested.
I'll finish this up with some footage from the flyer as I follow Tuco around the shop,
and I'll describe some of the problems we run into along the way.
That orange rectangle there is my manual control, but when you just see those yellow sprites
marching around Tuco, that's when the computer vision algorithm is trying to track him and
might be doing so successfully.
And it seems like we're doing alright so far.
You'll notice he isn't quite in the middle of the frame but when he moves toward the
edge we scroll over toward him.
Now, there was a bug at the time, where the gimbal couldn't aim very far downward
or its axes would start vibrating.
Now I can adjust the gains to try and fix the root cause, but when this video was recorded
the best I could do was automatically aim the gimbal toward its home position to recover.
But here you can see that behavior oscillating with the computer vision tracking.
And now it's doing an okay job, but somewhat noisy.
You see the gimbal jerk around occasionally.
This is a symptom of the tracking PID controllers becoming too sensitive to computer vision
tracking noise when the region of interest is so large relative to the screen.
I'll want to tune the gimbal tracking algorithm for sure.
Additionally, once the software is ready to fly the bot automatically, this is when it
would back up to keep the region of interest from filling so much of the frame.
It's worth mentioning that the automatic mode is limited in how close to the ground it can
fly, because the sensors and cord attachments at the flyer's saucer area must stay above
all obstacles.
This kind of flight close to the ground will only be possible with a human pilot to spot
hazards around the bot and cables.
If you've been following this project, I hope you enjoyed seeing this progress update.
But I also hope that these videos or the code and design files might inspire you to build
something new or ask new questions about the devices around you.
If you want more detail, I'll be releasing more of these higher-level videos, so you can
subscribe if you'd like to know about those.
You can also jump over to the archived or live scanlime-in-progress streams, to see
what the process is like in real-time.
This series is possible thanks to many of you who send in hardware or support the channel
with a donation on LiberaPay or Patreon.
Links are all in the description.
Thanks, everyone, and do let me know what you think!
I want to inspire people to understand and modify the technology in their lives, but
it's also important to me to help more people break into an increasingly complicated field.
I'll throw a few bolts for Tuco.
See you next time.