=Sensebridge: The Noisebridge Cyborg Group=
[[Image:CoinTypePagerMotors.jpg|right|350px|Coin type pager motor]]

"If you can't beat the robot armies, join them."

We are concerned with human-machine interfaces.
==Motivations==
We want to "make the invisible visible", to bridge our senses. Many group members were inspired years ago by this [http://www.wired.com/wired/archive/15.04/esp.html awesome Wired article], which describes obtaining an <i>unerring sense of direction</i> via a compass belt. Is the brain really plastic enough to adapt to entirely new senses? How natural would it feel after you've fully adapted? Then what happens when you take the device off?

What new sense do you want? What existing sense might it map onto?
==Terminology==
To avoid confusion (where possible) we'll try to use these terms.

"Sensor" is the device that detects external stuff that the body (probably) can't detect. Sensors can be divided into those that sense the world and those that digitize a user's actions. The latter are more commonly called "human interface devices" (HIDs).

"Display" is the way the device shows data to the body (not necessarily visually). This gets around the device-centric / body-centric problem that shows up when using the terms input & output.

"Armature" is the physical component of the device: the belt, hat, armband, etc.; distinct from the electronics.
==Mailing List==
We have created a [https://www.noisebridge.net/mailman/listinfo/cyborg cyborg mailing list] on the Noisebridge server. Please join us!

==Component Sourcing==
Here are pages for discussing:
===[[Sensors|sensors]]===
detect a property of the world

===[[Displays|displays]]===
communicate it to the body/mind

===[[Armatures|armatures]]===
hold the above together and in the proper place on the body

These pages also include a list of all known/speculated types of each.
==Sensor and Display API==
*What might that mean?
**A standard protocol that can:
**#encompass most sensory data of the sort we are interested in researching
**#allow for presentation of any of those data through any of a wide variety of modalities
*Why?
**Plug-and-play(ish) interoperability of sensors and displays. Build three sensors and three displays, and we have nine possible combinations, and so on.
**Simplify development by separating sensing and presentation design/tasks.
*How?
**On the data sensing end:
**#Identify a primary set of interesting stimuli, consider various encodings thereof.
**#Abstract away from the physical phenomena the stimuli represent and look instead at common patterns in the data streams.
**#???
**On the data display end:
**#Identify and prototype a number of display methods.
**#For each method think of:
**#*What modulations of the signal are possible: intensity, spacing, direction, timing...? Based on that, what is the theoretical bandwidth of that method?
**#*How sensitive is the average person there? How much of the signal can the brain interpret meaningfully (too much noise vs. too weak of a signal)? Based on that, what is the <em>practical</em> bandwidth that we might expect?
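The API idea above can be sketched concretely. A minimal, hypothetical version of the protocol: every sensor emits readings normalized to [0.0, 1.0], every display consumes such readings, so any sensor can drive any display. All names here (`Sensor`, `Display`, `CompassSensor`, `VibroDisplay`, `bridge`, `bandwidth_bits_per_s`) are illustrative assumptions, not an existing Sensebridge API; the bandwidth helper is just the back-of-envelope log2 estimate implied by the questions above.

```python
import math
import random

class Sensor:
    """Produces readings normalized to the range [0.0, 1.0]."""
    def read(self) -> float:
        raise NotImplementedError

class Display:
    """Consumes a normalized reading and presents it to the body somehow."""
    def show(self, value: float) -> None:
        raise NotImplementedError

class CompassSensor(Sensor):
    """Stand-in compass: a heading in degrees, normalized by 360."""
    def read(self) -> float:
        heading = random.uniform(0.0, 360.0)  # placeholder for real magnetometer I/O
        return heading / 360.0

class VibroDisplay(Display):
    """Maps a normalized value onto a pager-motor PWM duty cycle (0-255)."""
    def __init__(self) -> None:
        self.duty = 0
    def show(self, value: float) -> None:
        self.duty = round(255 * min(1.0, max(0.0, value)))  # clamp, then scale

def bridge(sensor: Sensor, display: Display) -> None:
    """Any sensor drives any display: three of each gives nine combinations."""
    display.show(sensor.read())

def bandwidth_bits_per_s(levels: int, sites: int, updates_per_s: float) -> float:
    """Back-of-envelope capacity: log2(distinguishable states) bits per update."""
    return math.log2(levels * sites) * updates_per_s
```

For example, with 8 motor sites and 4 distinguishable intensity levels updated twice per second, the estimate is log2(8 × 4) × 2 = 10 bits/s, before accounting for perceptual noise.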
==Current Projects==
[[Compass Vibro Anklet|Compass Vibro-Anklet]]
[[Eyes in the back of your back]]

[[Tongue Tingler|Tongue Tingler (Trans-lingual Electric Nerve Stimulator)]]

[[Free hands input device]]

[[Pulse Necklace]]

[[QuickLook Displays]]

[[UltraSonicHearing|Ultrasonic Hearing / Echolocation]]

[[TechnicolorDreamcoat|Technicolor Dreamcoat]]

[[List of senses we do not possess]]
===Miscellaneous Distractions===
[http://pony.noisebridge.net/~cmaier/recherche_du_temps_perdu/ Edible interconnects]

==Cybernetic Theory==
Cybernetic loop of GPS + vibrating hat, [https://www.noisebridge.net/images/d/d7/Find_Treasure_%28GPS%29.jpg diagram].

Pseudo-cybernetic loop showing neural integration of artificial sensory system, [https://www.noisebridge.net/images/8/8f/Sensory_Addition_%281.5_order%29.jpg diagram].
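The GPS + vibrating-hat loop in the first diagram can be sketched in code: compute the bearing from the wearer to a target, vibrate the motor nearest that direction, and let the wearer's turning close the loop. This is a hedged sketch, not the actual hat firmware; the 8-motor layout and both function names are assumptions.

```python
import math

def motor_for_bearing(bearing_deg: float, n_motors: int = 8) -> int:
    """Pick which of n evenly spaced motors (0 = straight ahead) should vibrate,
    for a bearing measured clockwise from the wearer's facing direction."""
    step = 360.0 / n_motors
    return int(((bearing_deg % 360.0) + step / 2) // step) % n_motors

def bearing_to_target(lat, lon, tlat, tlon, heading_deg):
    """Initial great-circle bearing from (lat, lon) to (tlat, tlon), in degrees,
    expressed relative to the wearer's current compass heading."""
    phi1, phi2 = math.radians(lat), math.radians(tlat)
    dlon = math.radians(tlon - lon)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    absolute = math.degrees(math.atan2(y, x)) % 360.0
    return (absolute - heading_deg) % 360.0
```

Each cycle: read GPS and compass, call `bearing_to_target`, drive the motor that `motor_for_bearing` selects; the wearer turning toward the buzz changes the next reading, closing the loop.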
[http://en.wikipedia.org/wiki/Cybernetic Cybernetics page on Wikipedia]
==Links==
YouTube video of an experimenter stimulating his face with small electrical pulses to produce externally controlled movements: http://www.youtube.com/watch?v=dy8zUHX0iKw

http://transcenmentalism.org/OpenStim/tiki-index.php

http://openeeg.sourceforge.net/doc/

http://www.biotele.com/ - remote control cockroach

http://www.instructables.com/id/Breath-powered-USB-charger/

VibroHat in action at BIL 2008: http://www.flickr.com/photos/quinn/2307779651/in/photostream/
[[Lamont's PWM code]]

==Meeting Notes==
[[NBC_2009Mar29|March 29, 2009]]

[[NBC_2009Apr05|April 5, 2009]]

[[NBC_2009Apr19|April 19, 2009]]

(the meetings continue apace; the minutes, sadly not)

[[NBC_2009July12|July 12, 2009]]

[[NBC_2009July13|Hack Session July 13, 2009]]

[[NBC_2009July19|July 19, 2009]]

[[NBC_2009August2|August 2, 2009]]

[[NBC_2009October4|Oct 4, 2009]]
[[Category:Sensebridge]]
[[Category:Neuro]]
Revision as of 20:18, 11 May 2012