Revision as of 00:06, 25 March 2009
Sensebridge: The Noisebridge Cyborg Group
"If you can't beat the robot armies, join them."
We are concerned with human-machine interfaces.
Terminology
To avoid confusion (where possible), we'll try to use these terms.
"Sensor": the device that detects external stuff that the body (probably) can't detect.
"Display": the way the device presents data to the body (not necessarily visually). This gets around the device-centric vs. body-centric confusion that arises when using the terms "input" and "output".
"Armature": the physical component of the device (the belt, hat, armband, etc.), as distinct from the electronics.
Sensors (Input to the body)
Our first task will be to experiment with data-input modes. What signals can we overlay upon the body's standard senses, and by what means? Of particular interest are electro- and vibro-tactile stimulation. What bandwidths are available according to the nature and location of the input stimuli?
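To make the bandwidth question concrete, here is a back-of-the-envelope sketch: a display channel with a number of distinguishable levels carries log2(levels) bits per update, so theoretical bandwidth is channels × log2(levels) × update rate. All the numbers below (8 motors, 4 levels, 2 Hz) are made-up assumptions for illustration, not measurements.

```python
import math

def channel_bandwidth(levels, updates_per_sec, channels=1):
    """Theoretical bandwidth in bits/sec for a tactile display:
    each channel carries log2(levels) bits per update."""
    return channels * math.log2(levels) * updates_per_sec

# Hypothetical vibrotactile belt: 8 motors, 4 distinguishable
# intensity levels per motor, refreshed twice per second.
belt = channel_bandwidth(levels=4, updates_per_sec=2, channels=8)
print(belt)  # 32.0 bits/sec theoretical
```

The practical bandwidth will be far lower, which is exactly what the sensitivity experiments above would measure.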
Secondly, we need to identify useful measurements to input. Sensory augmentation has been around for centuries; think of magnifying glasses or hearing aids. Sensory substitution, on the other hand, involves the transposition of one sensory stimulus into another: seeing sound, hearing distance, feeling vision, etc. Sensory extension is measuring signals beyond regular human abilities, such as magnetometry, radar, or echolocation, or even digital signals from a wireless network carrying information as diverse as the locations of friends in a crowd or activity on one's website.
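A sensory-substitution mapping can be as simple as a clamped linear interpolation from one stimulus range to another. This sketch of "hearing distance" maps a hypothetical rangefinder reading onto an audible pitch; the range and frequency constants are arbitrary assumptions, not values from any of our hardware.

```python
def distance_to_pitch(distance_m, max_range_m=4.0,
                      low_hz=220.0, high_hz=880.0):
    """Map a rangefinder reading onto an audible pitch:
    nearer objects sound higher (all constants are arbitrary)."""
    # Clamp to the sensor's range, then interpolate linearly.
    d = min(max(distance_m, 0.0), max_range_m)
    nearness = 1.0 - d / max_range_m
    return low_hz + nearness * (high_hz - low_hz)

print(distance_to_pitch(0.0))  # 880.0 Hz: very close
print(distance_to_pitch(4.0))  # 220.0 Hz: edge of range
```

Whether a linear scale is perceptually right is an open question; pitch perception is roughly logarithmic, so a log mapping might be a better fit in practice.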
Component Sourcing
Here are pages for discussing sensors, displays, and armatures in terms of price and usability.
We are ordering some RBBBs. Sign up here: RBBB ORDER SIGN UP!!!1!one!
Stimulus Sensor and Input Method Common API
- What might that mean?
  - A standard protocol that can:
    - encompass most sensory data of the sort we are interested in researching
    - allow for presentation of any of those data through any of a wide variety of modalities
- Why?
  - Plug-and-play(ish) interoperability of sensors and input modes. Build three sensors and three input modes, and we have nine possible combinations, and so on.
  - Simplify development by separating sensing and presentation design/tasks.
- How?
  - On the data sensing end:
    - Identify a primary set of interesting stimuli; consider various encodings thereof.
    - Abstract away from the physical phenomena the stimuli represent and look instead at common patterns in the data streams.
    - ???
  - On the data presentation end:
    - Identify and prototype a number of input methods.
    - For each method, consider:
      - What modulations of the signal are possible: intensity, spacing, direction, timing...? Based on that, what is the theoretical bandwidth of that input method?
      - How sensitive is the average person there? How much of the signal can the brain interpret meaningfully (too much noise vs. too weak a signal)? Based on that, what is the practical bandwidth that we might expect?
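The common-API idea above can be sketched as two small interfaces: sensors normalize whatever they measure into a common data shape, and displays render that shape to the body. Everything here (class names, the choice of a 0.0-1.0 float as the common currency, the compass example) is a hypothetical design sketch, not an agreed protocol.

```python
class Sensor:
    """Produces a stream of floats normalized to 0.0-1.0, abstracting
    away the physical phenomenon (heading, distance, signal strength...)."""
    def read(self) -> float:
        raise NotImplementedError

class Display:
    """Renders a normalized 0.0-1.0 value to the body
    (vibration intensity, pulse rate, etc.)."""
    def show(self, value: float) -> None:
        raise NotImplementedError

class CompassSensor(Sensor):
    """Hypothetical example: heading in degrees, normalized to 0.0-1.0."""
    def __init__(self, heading_deg: float = 0.0):
        self.heading_deg = heading_deg
    def read(self) -> float:
        return (self.heading_deg % 360.0) / 360.0

class PrintDisplay(Display):
    """Stand-in for a vibration-motor driver: just prints the value."""
    def show(self, value: float) -> None:
        print(f"motor duty cycle: {value:.2f}")

def bridge(sensor: Sensor, display: Display) -> None:
    """Any sensor can drive any display: this is the plug-and-play payoff."""
    display.show(sensor.read())

bridge(CompassSensor(heading_deg=90.0), PrintDisplay())  # motor duty cycle: 0.25
```

With three Sensor subclasses and three Display subclasses, `bridge` gives all nine combinations for free, which is the interoperability argument made under "Why?" above.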
Current Projects
Compass Vibro-Anklet
Cybernetic Theory
Cybernetic loop of GPS + vibrating hat, diagram.
Pseudo-cybernetic loop showing neural integration of artificial sensory system, diagram.
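One iteration of a loop like those in the diagrams can be sketched in code. This is a minimal, hypothetical version of the "northmost motor buzzes" scheme used by haptic-compass devices like the VibroHat; the motor count, numbering convention, and rounding are all assumptions, not actual firmware.

```python
def active_motor(heading_deg, n_motors=8):
    """Pick which of n motors (worn in a ring, motor 0 facing forward,
    numbered clockwise) should vibrate so the buzz always points north.
    heading_deg is the direction the wearer is facing."""
    # North sits at -heading relative to the wearer's forward direction.
    relative = (-heading_deg) % 360.0
    return round(relative / (360.0 / n_motors)) % n_motors

print(active_motor(0))   # 0: facing north, the front motor buzzes
print(active_motor(90))  # 6: facing east, north is on the wearer's left
```

The full cybernetic loop is just this function run continuously: sense heading, update the motor, let the wearer's movement change the heading, repeat.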
Brain/Body Output
The study of output asks how we can interface with machines beyond manual button pushing and lever pulling. Some ideas in this field are biomechanical feedback, pulse detection, eye tracking, etc.
Interesting questions
What new sense do you want? What existing sense might it map onto?
The brain seems to integrate consistent input quite easily. What happens after this integration has occurred, when the input stops (e.g., you take off the belt after wearing it for days)?
Links
YouTube video of an experimenter stimulating his face with small electrical pulses to produce externally controlled movements: http://www.youtube.com/watch?v=dy8zUHX0iKw
http://transcenmentalism.org/OpenStim/tiki-index.php
http://openeeg.sourceforge.net/doc/
http://www.biotele.com/ - remote-controlled cockroach
http://www.instructables.com/id/Breath-powered-USB-charger/
VibroHat in action at BIL 2008: http://www.flickr.com/photos/quinn/2307779651/in/photostream/