% On Opening Black Boxes or: How I Learned to Stop Worrying and Love G-Code
%
% 28 Nov 2019

![Baby Yoda, engraved. ([G-code](baby-yoda.nc))](baby-yoda.png){width=100%}

**TL;DR** PhotoVCarve should not cost $149. I made [my own](https://github.com/built1n/rastercarve).

Recently I've gotten my hands on a 3-axis [ShopBot milling
machine](https://www.shopbottools.com/products/max). For the
uninitiated, a CNC mill is essentially a robotic carving machine --
think "*robot drill*": you put in a piece of wood/foam/aluminum,
program the machine, and out comes a finished piece with the right
patterns cut into it. I had the idea of
[engraving](https://en.wikipedia.org/wiki/Engraving) a raster image
using the machine, and there happens to be a nice piece of software
out there that claims to do just that: Vectric's
[PhotoVCarve](https://www.vectric.com/products/photovcarve).

There's just one problem: PhotoVCarve costs $149. Now, I have no
qualms paying for software when it makes sense to do so, but in this
case, $149 is simply excessive -- especially for a hobbyist. And
besides, just see for yourself in the video below: all PhotoVCarve
does is take an image and draw a bunch of grooves over it -- *nothing
that couldn't be done in a couple lines of Python,* I thought.

<center>

[![PhotoVCarve - Engraving Photographs](https://img.youtube.com/vi/krFyBxYwWW8/0.jpg){width=60%}](https://www.youtube.com/watch?v=krFyBxYwWW8)

</center>

## G-Code

The first step in the process was figuring out *how* to control a CNC
machine. Some Googling told me that virtually all machines read
[G-code](https://en.wikipedia.org/wiki/G-code), a plain-text language
of alphanumeric commands that direct the tool's movement in three
dimensions. It looks something like this:

~~~ {.numberLines}
G00 X0 Y0 Z0.2
G01 Z-0.2 F10
G01 X1.0 Y0
~~~

These three commands tell the machine to:

1. Go to (0, 0, 0.2), rapidly (`G00` is "rapid traverse").
2. Plunge to (0, 0, -0.2) at the feed rate set by `F10` (`G01` is a controlled linear move, slower than a rapid).
3. Cut to (1, 0, -0.2) at the same feed rate (`Z` is omitted, so the previous depth carries over).

My program just had to output the right sequence of G-code commands,
which I could then feed into the ShopBot control software. (This was
far simpler than I had originally imagined.)
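
To make that concrete, here's a toy sketch of my own (not RasterCarve's
actual output code; the helper names and values are made up) showing
that printing the three-command example above really is just a few
lines of Python:

~~~ {.python .numberLines}
# Toy G-code emitter: rapid() issues a G00, feed() a G01 with an
# optional F feed rate. Coordinates here are the example values above.
def rapid(x, y, z):
    print("G00 X%.4f Y%.4f Z%.4f" % (x, y, z))

def feed(x, y, z, f=None):
    print("G01 X%.4f Y%.4f Z%.4f%s" % (x, y, z, " F%g" % f if f else ""))

rapid(0, 0, 0.2)        # hover above the origin
feed(0, 0, -0.2, f=10)  # plunge into the material
feed(1.0, 0, -0.2)      # cut across to X = 1 at the same depth
~~~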

At this point, one of my flow states kicked in. I sat down and got to
coding.

## The Program

The development process was surprisingly straightforward -- I put in
perhaps a total of 4 hours from my initial proof-of-concept to the
current viable prototype. There were no major hiccups this time
around, and even though I'm still in the process of learning it,
Python made things *so* much easier than C (or God forbid -- [ARM
assembly](adieu-quake.html#asm-listing)).

The heart of my program is a function,
[`engraveLine`](https://fwei.tk/git/rastercarve/tree/src/rastercarve.py?id=c2de4a3258c3e37d4b49a41d786eef936262f137#n118) (below),
which outputs the G-code to engrave one "groove" across the image. It
takes an initial position vector on the border of the image and a
direction vector telling it which way to cut.

~~~ {.python .numberLines}
# Engrave one line across the image. start and d are vectors in the
# output space representing the start point and direction of
# machining, respectively. start should be on the border of the image,
# and d should point INTO the image.
def engraveLine(img_interp, img_size, ppi, start, d, step = LINEAR_RESOLUTION):
    v = start
    d = d / np.linalg.norm(d)

    if not inBounds(img_size, v):
        print("NOT IN BOUNDS (PROGRAMMING ERROR): ", img_size, v, file=sys.stderr)

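    # retract to a safe height, then rapid over to the start of the groove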
    moveZ(SAFE_Z)
    moveRapidXY(v[0], v[1])

    first = True

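    # step along the groove, reading the cut depth from the image at each point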
    while inBounds(img_size, v):
        img_x = int(round(v[0] * ppi))
        img_y = int(round(v[1] * ppi))
        x, y = v
        depth = getDepth(getPix(img_interp, img_x, img_y))
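        # plunge on the first point of the groove, then feed along it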
        if not first:
            move(x, y, depth)
        else:
            first = False
            moveSlow(x, y, depth)

        v += step * d
    # return last engraved point
    return v - step * d
~~~

After this was written, it was a simple exercise to write a driver
function to call `engraveLine` with the right vectors in the right
sequence -- and that was all it took![^1] (I really wonder how Vectric
manages to charge $149 for this...)
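
For grooves running straight across the image, that driver boils down
to a loop. Here's a rough sketch of my own, not the real driver (which
also handles the angled case), assuming a `LINE_SPACING` constant for
the distance between grooves:

~~~ {.python .numberLines}
# Simplified driver: engrave horizontal grooves spaced LINE_SPACING
# apart, each cut left to right across the image.
def engraveImage(img_interp, img_size, ppi, spacing = LINE_SPACING):
    height = img_size[1]            # assuming img_size is (width, height) in output units
    d = np.array([1.0, 0.0])        # direction: into the image, along +X
    y = 0.0
    while y <= height:
        start = np.array([0.0, y])  # begin each groove on the left border
        engraveLine(img_interp, img_size, ppi, start, d)
        y += spacing
~~~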

I fired up the program on a test image and fed its output into
ShopBot's excellent G-code previewer. [Success](#top)! I added a
couple of tweaks (getting the lines to cut at an angle was fun) and I
christened the program
[*RasterCarve*](https://github.com/built1n/rastercarve).

The G-code that produced the image at the top of this post is
[here](baby-yoda.nc). In addition to the ShopBot previewer, Xander
Luciano has an excellent online [simulator](https://ncviewer.com) that
can preview this toolpath.

## Conclusion

This was a fun little project that falls into the theme of "gradually
opening up black boxes." G-code, I learned, isn't nearly as hard as it
might seem. It's all too easy to abstract away the details of a
technical process, but sometimes the best way to really understand
something is to open up the hood and tinker with it.

---

## Appendix: Machined Results

Here are some examples of engraving results, along with the
corresponding G-code files.

![Into the Jaws of Death. 30° carbide engraving bit, .080" maximum depth, 110% stepover. ([G-code](d-day.nc.zip))](d-day.jpg){width=100%}

![Original image. ([Source](https://en.wikipedia.org/wiki/Into_the_Jaws_of_Death#/media/File:Into_the_Jaws_of_Death_23-0455M_edit.jpg))](d-day-orig.jpg){width=100%}

[^1]: I'm probably oversimplifying here. There was, in reality, some
neat vector math to figure out just *where* the "border" of the image
would be when the grooves were at an angle.