AlfaEditor: The 3D projective tactile and sound dimensional explorer for the blind.
Learning maths while blind ain't easy.
Let me introduce you to some of the pain points I go through in my quest for knowledge. These apply to every visually impaired (VI) individual out there. I think it's important to educate the educated and shift their perception towards equality and the right to access content in an academic environment.
My journey into maths started 4 years ago, when I took bridging classes at Taylors College in Auckland, New Zealand. I wanted to refresh my math skills to enter a degree in computer science. When I rocked up and told the staff I wanted to take calculus and statistics, I was met with an uneasy silence: there are simply no math tools for educating blind students in STEM subjects. This is where perception versus reality becomes a hard fact.
Math is hard for everyone. It is an acquired skill, not something that comes as naturally as language, and it requires a lot of practice. Without a proper way to understand all the techniques involved in solving math problems, you are left frustrated, stuck against an arithmetic step or trying to identify one out of thousands of identities, unable to move forward. Now, just to make things interesting, take away sight completely and challenge yourself a bit further down the rabbit hole.
What are the main issues here? Visualisation. Math is solved by finding patterns and associating them with theorems that simplify the process of getting to a correct answer. Visualisation comes in the form of graphs: understanding how a function behaves over its domain, what its range is, what special properties it holds. Sometimes pure visual intuition can nudge most of us in the right direction. All of this is absent when you can't see those graphs. It gets "better" (cynicism intended) as you climb the math ladder towards more complex concepts such as derivatives, ODEs and PDEs, multidimensional vectors and matrices. I have met a slew of obstacles along the way; I will only list the most impactful.
1. Because it's hard, you have to spend a lot more time locking those concepts into memory: nothing is accessible from the get-go, and you can't rely on illustrations, as they are not suited to non-visual representation.
2. It takes longer during exams to solve the problems handed to you. My worst experience was spending 8 hours, starting at 8 AM, with two 5-minute breaks, to complete the final exam for my calculus class. My teacher gave me some Mars bars, knowing very well that my brain was in overdrive solving those problems.
3. Don't expect anyone to understand what you need. Even if the staff do everything to help, there is rarely any precedent for formally adapting math education for the blind. So you're left with your own torch to search the labyrinth of ways to attempt conquering this beast, and there are not many trailblazers out there who could even give you a starting point. Just as an edifying example: in the over 150 years since it was established, the University of Queensland, located in sunny Brisbane, has never had a single blind student enrol in a bachelor of Math, Engineering or a related program. And I'm not blowing smoke here. I would have been very content not having to find the most efficient way to deal with maths, but my previous life helped me map out something of a workable system to get there.
This open source project aims to provide tactile and sound-based feedback for various materials, as described below.
Here is a video demonstrating what I mean:
Using Apple's SceneKit framework, AlfaEditor provides a 3D world connected to a Leap Motion controller (see Leap Motion), mapping a flat printed image on the platform to an object within the volumetric boundary captured by the sensor. The Leap Motion can detect hands, fingers and gestures, and assign them to controls through its API. The OSX application uses the 3D framework to render the objects and provide more information to the user as the hand moves around, hitting tactile bumps or reliefs on the flat printed object.
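To give a rough feel for the kind of mapping involved, here is a minimal Swift sketch that converts a fingertip position reported by the sensor (in millimetres, device coordinates) into normalised coordinates on the flat A4 plane. The struct name, axis conventions and calibration bounds are my own illustrative assumptions, not AlfaEditor's actual code or the Leap Motion SDK.

```swift
import Foundation

// Hypothetical mapping from Leap Motion device coordinates (millimetres,
// origin at the sensor) to the flat A4 plane used by the editor,
// normalised to [0, 1] on each axis. Bounds are illustrative guesses
// at a tracked volume, not real device calibration values.
struct LeapToPlaneMapper {
    var minX: Float = -120, maxX: Float = 120  // left-right over the sheet
    var minZ: Float = -85,  maxZ: Float = 85   // towards/away from the user

    // Returns normalised (u, v) plane coordinates, or nil when the
    // fingertip is outside the tracked volume.
    func planePoint(x: Float, z: Float) -> (u: Float, v: Float)? {
        guard x >= minX, x <= maxX, z >= minZ, z <= maxZ else { return nil }
        return ((x - minX) / (maxX - minX), (z - minZ) / (maxZ - minZ))
    }
}
```

In the real application the normalised point would then be projected into the SceneKit world and hit-tested against the imported plane to find which object the finger is over.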
Objects are imported from .svg (Scalable Vector Graphics) files and placed as a flat plane in the 3D world, associated with tags and descriptors which can be pulled up from the editor. In the future, we want to add a few things:
- Complete the editing modes for imported scenes, to attach HTML tags and provide further information by loading a WebKit WebView when tapping on an object
- Use convolutional neural networks and QR codes printed directly on the physical object to provide freedom of rotation. Currently the A4 sheet is placed in a bounding pedestal which holds the Leap Motion facing downwards; however, the infrared sensor has difficulty picking up certain objects when faced with reflective materials
- Create more interesting use cases, such as 3D point-source sounds when exploring a map
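To make the tag/descriptor idea above concrete, here is a small sketch of how descriptions might be attached to SVG element ids and looked up when the tracked fingertip hits a node. The type names and fields are illustrative assumptions on my part, not the project's actual data model.

```swift
import Foundation

// Hypothetical descriptor store: each shape imported from an .svg file
// keeps its element id, and the editor attaches a short spoken label plus
// a longer HTML description to it.
struct ObjectDescriptor {
    let label: String        // short label spoken on first contact
    let detailHTML: String   // longer description shown in a web view on tap
}

final class DescriptorStore {
    private var descriptors: [String: ObjectDescriptor] = [:]

    func attach(_ descriptor: ObjectDescriptor, toSVGElement id: String) {
        descriptors[id] = descriptor
    }

    // Called when the fingertip hits a node; by convention the node's name
    // would be the SVG element id it was created from.
    func descriptor(forNodeNamed name: String) -> ObjectDescriptor? {
        return descriptors[name]
    }
}
```

With a store like this, touching a shape speaks its label immediately, and a tap gesture could open the longer HTML description.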
One of the more interesting use cases is the ability to identify objects from 3D prints within the volume, which can be placed around and monitored by the Leap Motion. Imagine playing a game of chess in a space where you know what you are touching and the software provides more information about what is around it, using both sound and contextual information. Or creating an adventure book where the pages are printed blocks, objects and surfaces representing the area the user is exploring.
The GitHub repository is available here.
Note: Some examples are included in the examples.zip archive, which you should extract into your Dropbox folder after first creating a folder named "AlfaEditor".
This is a work in progress, and it will be packed with more features as I unify each subsystem into a global platform covering every aspect of the challenges a blind student can face, from an early age all the way through to what I intend to achieve: a master's degree in one of the STEM topics.
If you have any questions about getting started with the project, or want to know more about how it works, drop me a line through the contact page.