Dev Log - Guitar Trainer - Part 3 - Components
For the main part of the application, we'll need a couple of things:
- The 3D visual interface
- The audio input interface
For the visual part, we'll need at least a fretboard and a note track. We'll need some ability to tween the camera position. We'll also need a way to serialize the exercise's notes, and preferably a good way to input notes by hand, because creating any kind of decent exercise builder is going to be a lot of work.
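As a starting point for the serialization question, here's a minimal sketch of a hand-editable exercise format. The shape and field names are my own placeholders, not a format we've committed to; the idea is just that a flat `[startBeat, duration, string, fret]` tuple per note is easy to type by hand and trivial to round-trip through JSON.

```javascript
// Hypothetical exercise format: each note is [startBeat, durationBeats, string, fret].
const exercise = {
  title: "A minor pentatonic, position 1",
  bpm: 80,
  // [startBeat, durationBeats, string (1 = high E), fret]
  notes: [
    [0, 1, 6, 5],
    [1, 1, 6, 8],
    [2, 1, 5, 5],
    [3, 1, 5, 7],
  ],
};

function serialize(ex) {
  return JSON.stringify(ex);
}

function deserialize(json) {
  const ex = JSON.parse(json);
  // Basic validation so a hand-typed exercise fails loudly, not silently.
  for (const n of ex.notes) {
    if (n.length !== 4 || n.some((v) => typeof v !== "number")) {
      throw new Error(`bad note: ${JSON.stringify(n)}`);
    }
  }
  return ex;
}

const roundTripped = deserialize(serialize(exercise));
```

Whatever format we land on, the validation step matters more than the encoding, since hand-entered data is going to have typos.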
I'll be using three.js with postprocessing for the rendering.
For the audio input, we'll rely on the Web Audio API to get the raw signal, and some sort of digital signal processing (DSP) library to make sense of it. For my first version I had to learn a bunch of music theory and about fast Fourier transforms and stuff, and came up with my own basic pitch detection algorithm, but we'll see if we can do better when we get there.
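To make the pitch detection problem concrete, here's a rough sketch of one common approach, time-domain autocorrelation. To be clear, this is not the algorithm from my first version, just an illustration of the kind of thing a DSP step has to do. It assumes a mono `Float32Array` of samples like the one `AnalyserNode.getFloatTimeDomainData` fills.

```javascript
// Naive autocorrelation pitch detector (illustrative, not production-grade):
// find the lag at which the signal best correlates with itself.
function detectPitch(samples, sampleRate) {
  const n = samples.length;
  // Only search lags corresponding to roughly 60 Hz .. 1000 Hz,
  // which comfortably covers a guitar's fundamentals.
  const minLag = Math.floor(sampleRate / 1000);
  const maxLag = Math.floor(sampleRate / 60);
  let bestLag = -1;
  let bestCorr = 0;
  for (let lag = minLag; lag <= maxLag; lag++) {
    let corr = 0;
    for (let i = 0; i + lag < n; i++) corr += samples[i] * samples[i + lag];
    if (corr > bestCorr) {
      bestCorr = corr;
      bestLag = lag;
    }
  }
  return bestLag > 0 ? sampleRate / bestLag : 0;
}

// Synthetic check: a 440 Hz sine should come back near 440.
const sampleRate = 44100;
const buf = new Float32Array(2048);
for (let i = 0; i < buf.length; i++) {
  buf[i] = Math.sin((2 * Math.PI * 440 * i) / sampleRate);
}
const freq = detectPitch(buf, sampleRate);
```

Real guitar signals are much messier than a sine wave (strong harmonics, noise, multiple strings ringing), which is exactly why a proper DSP library is appealing here.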
But let's talk about the 3D stuff first: it's much easier to tell when the 3D part is working, and once it is, it gives us a visual way to tell when the audio input part is working too. In programming, I find that starting with whatever gives you the most useful feedback that your whole system isn't broken is a good move. You can get some confidence from things like unit tests, but there's nothing like seeing it working, and it can be tricky to write tests for stuff like 3D scenes, or D3 graphs and whatnot.
Whenever you delegate rendering, it's harder to test, and harder still when the target is something like WebGL or Canvas. With a lot of D3, you can at least inspect the DOM, but I hardly know anything about WebGL. We'll mostly just have to trust that three.js is going to do its job, and verify that the three.js objects are configured the way we expect.
Makeup of the Scene
Conceptually, we have a camera, the fretboard, and the notes, which fly into the fretboard.
The camera and the fretboard move together, but the camera should be free to zoom in and out to cover different sized sections of the fretboard (e.g. to be able to see a large slide), or to shift the focused section (e.g. when you shift hand position). The notes fly into the fretboard, but functionally, maybe the fretboard flies into the notes instead. That's an implementation decision that depends on what we do with the geometry later; one approach might be more efficient in terms of how much geometry has to move each frame, for example. Either way, we have two basic parts, the POV (fretboard and camera) and the note track, and one of them moves. We'll think more about which one should move when we get to it.
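Whichever part ends up moving, the camera zoom/shift behavior comes down to tweening a value over time. Here's a tiny sketch of that, independent of three.js; the function names are placeholders, and in the real scene we'd apply this per component of the camera position.

```javascript
// Standard ease-in-out-quad curve: slow start, slow stop.
function easeInOutQuad(t) {
  return t < 0.5 ? 2 * t * t : 1 - (2 - 2 * t) ** 2 / 2;
}

// Interpolate a scalar from a to b; t is normalized elapsed time in [0, 1].
function tween(a, b, t) {
  // Clamp so an overshooting frame time can't fling the camera past b.
  const u = easeInOutQuad(Math.min(1, Math.max(0, t)));
  return a + (b - a) * u;
}

// e.g. zooming the camera's z from 10 to 4, sampled at the halfway point:
const mid = tween(10, 4, 0.5);
```

In practice we'd probably reach for an existing tweening library rather than hand-roll easing curves, but it's useful to know there's nothing magic underneath.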
For the fretboard, we're going to want it to give some visual feedback upon receiving user audio: have it act as a heatmap of which notes are sounding on the guitar. That means we'll need individual string segments between frets that we can color separately.
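Driving that heatmap means mapping a detected frequency back to the string/fret segments that could be producing it, since the same pitch exists in several places on the neck. A sketch of that mapping, assuming standard tuning and MIDI note numbers (the helper names are placeholders):

```javascript
// Open-string pitches as MIDI numbers, string 1 (high E) to string 6 (low E).
const OPEN_STRINGS = [64, 59, 55, 50, 45, 40]; // E4 B3 G3 D3 A2 E2

// Convert a frequency in Hz to the nearest MIDI note number (A4 = 440 Hz = 69).
function freqToMidi(freq) {
  return Math.round(69 + 12 * Math.log2(freq / 440));
}

// All (string, fret) segments within the first `maxFret` frets
// that produce the given MIDI note.
function segmentsForMidi(midi, maxFret = 15) {
  const hits = [];
  OPEN_STRINGS.forEach((open, i) => {
    const fret = midi - open;
    if (fret >= 0 && fret <= maxFret) hits.push({ string: i + 1, fret });
  });
  return hits;
}

// A4 (440 Hz) lives at three spots in the first 15 frets.
const segs = segmentsForMidi(freqToMidi(440));
```

The heatmap could light all candidate segments dimly and let the exercise's expected note disambiguate which one to highlight, but that's a design call for later.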
We'll need a way to specify the scene's component geometry and cameras, and how we want to configure the renderer, effects, and viewports. We'll look at this in detail in the next part.