Thinking about the future. Imagining the future. Getting ready for the future.
Well, you know. The future comes. Like it or not.
If you ask me, it is the nature of life that what you think will happen does not happen. Something else happens.
"The future is here but it is unevenly distributed."
- William Gibson
"We drive into the future using only a rear view mirror."
- Marshall McLuhan
"The future does not extend the past." This sounds good but it is not really true. I would say that we never know what part of today will dominate our lives tomorrow. Or you could say that we never know what part of the past will extend to the future.
While we cannot predict the future we can influence it. And, I think that Wayne MacPhail's own experience is a good example of how this happens.
Wayne was one of the first people to ride the internet. As a staff writer at the Hamilton Spectator, he developed a series of articles on AIDS. They became a popular resource, and people kept asking him for the articles and for specific pieces of information they contained. A victim of his own success and a programmer since grade 9, Wayne knew there was an easier way to share his work. He eventually turned the series into hypertext (HTML, hypertext markup language): his six articles became 36 linked text files, available on campuses and other places around the world.
This is a good example of how "the future happens." Someone takes pieces of present day technology and makes them into something that is much more than the parts.
Really, our current revolution in information and communication is no different from the revolution in transportation that happened in the middle of the 19th century. At least from my perspective, the steam engine, with its ability to do more work than animals and in more places than water power, transformed the way people lived. The revolution went further still with the refinement of the internal combustion engine and then the application of the assembly line to the manufacture of cars and trucks.
The cost of transportation plummeted time and again for decades.
Wayne MacPhail talked about a "trend to 0." That is, the cost of computing and the cost of communication are falling so quickly that, for all intents and purposes, they are approaching zero.
As a little look at what might be around the corner, the city of San Francisco offers high speed internet - fibre-optic high speed, at 100 Mbps - to some of its public housing units. After attempts to blanket the city with wi-fi failed, this solution turned out to be less expensive.
Interesting to note that Wayne mentioned Charles Babbage, who designed a mechanical calculating machine called the Difference Engine in 1822. Babbage's engine used gears to compute mathematical tables, logarithms among them. A working model was finally built in 1991, and it worked perfectly.
Ada Lovelace, a mathematician, went on to write what is often considered the first computer program - for Babbage's later design, the Analytical Engine - even though that machine was never built.
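(A small aside for the curious: the trick behind Babbage's engine was the "method of differences" - once you know the first few differences of a polynomial, you can crank out the whole table using nothing but addition. The little Python sketch below is mine, just to show the arithmetic; it has nothing to do with Babbage's actual mechanism beyond the idea.)

```python
# A toy illustration of the "method of differences": tabulate a polynomial
# using only additions, the way Babbage's engine did with columns of gears.

def difference_table(values, order):
    """Build the starting value plus its first `order` finite differences."""
    row = list(values[:order + 1])
    diffs = [row[0]]
    while len(row) > 1:
        row = [b - a for a, b in zip(row, row[1:])]
        diffs.append(row[0])
    return diffs

def tabulate(diffs, count):
    """Extend the table `count` steps using nothing but addition."""
    diffs = list(diffs)
    out = []
    for _ in range(count):
        out.append(diffs[0])
        # propagate each difference upward - roughly what the engine's gear columns did
        for i in range(len(diffs) - 1):
            diffs[i] += diffs[i + 1]
    return out

if __name__ == "__main__":
    poly = lambda n: n * n + n + 41            # a sample quadratic
    seed = [poly(n) for n in range(3)]         # a quadratic needs second differences
    print(tabulate(difference_table(seed, 2), 10))
    print([poly(n) for n in range(10)])        # same values, computed directly
```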
Frankly, I'm not really sure what any of this means. I don't know what will come next but I love to see the new ideas that rise out of the muck. Those that fail are often more interesting than those that succeed. To this day, I love Apple's Newton.
Wednesday, February 18, 2009
3D Without Glasses - Spatial View
Okay. Don't wait. Don't look up other references. Do not pass GO. Do not collect $300.00. Go directly to Spatial View.
I know, the site is sort of cheesy. Well, it's 3D!, fer Chrissake! It's supposed to be cheesy.
Today our Interactive Multi-Media class enjoyed a visit from James Hurley of Spatial View, makers of 3D hardware (and software) that DOES NOT NEED GLASSES(!)
Okay, I'm a 3D fanatic from waaaayyyyy back. If you look at my camera collection you'll find a 3D camera and viewer, and if you dig around a little more you'll find pictures of the kids - in 3D - taking a bath (and other exciting subjects). You have not lived until you've seen dirty bath water and soap bubbles in 3D.
Going back even further, you might know of a theatre director named Jim Warren. Jim is now a distinguished director of live theatre but at one time he was a penniless actor with a brilliant comic turn. He played a clown called "Jerome" who more or less sat on the stage, tried to get peanut butter out of an empty peanut butter jar and, out of sheer frustration (which turns to joy), threw bread at the audience. You had to be there. He also used an (unused) toilet plunger. I saw his show several times and even helped tape it for cable tv.
Well, we started talking and the idea developed to film Jim's show in 3D and install it at the Science Centre as an example of how we perceive depth. (As a side to a side to a side note I cannot find any sort of link for Jim. He's around. We bumped into each other at a party last year and, if I find the money he would - just for this one time - reprise "Jerome" for the 3D camera.)
So how does this "3D without a camera work? Let's get technical.
Do you remember those 3D postcards? That's one technique, called parallax barrier. The other is called lenticular overlay. That's the technique where the eyes in the postcard actually follow you, or one image morphs into another (think of tacky religious images). In both cases, the image is made from many views broken into tiny strips and arranged so that each eye sees the scene from a slightly different angle. The effect is not truly three dimensional; it's more like a series of flat images arranged one behind the other.
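To make the idea concrete, here's a rough Python sketch of the simplest two-view case (mine, not Spatial View's code): slice a left-eye view and a right-eye view into alternating pixel columns, and let the barrier or lens sheet steer each set of columns to the matching eye.

```python
# Column-interleave two views of the same scene into one frame for an
# autostereoscopic display. This is the bare two-view idea only.
import numpy as np

def interleave_columns(left_view, right_view):
    """Combine two equally sized H x W x 3 images into one column-interleaved frame."""
    assert left_view.shape == right_view.shape
    combined = np.empty_like(left_view)
    combined[:, 0::2] = left_view[:, 0::2]    # even columns carry the left eye's view
    combined[:, 1::2] = right_view[:, 1::2]   # odd columns carry the right eye's view
    return combined

# usage with two dummy frames
left = np.zeros((480, 640, 3), dtype=np.uint8)
right = np.full((480, 640, 3), 255, dtype=np.uint8)
frame = interleave_columns(left, right)
```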
Spatial View manufactures a screen overlay and some software to go with it. Clip the glass over your laptop screen, the camera in the computer finds your eyes and adjusts the image (which then shifts into color), and you are looking at a bright and shiny 3D image on your own personal computer. You will soon be able to buy an adapter for your iPhone, and you can play video games with it as well.
You can download plug-ins for several software packages, including Flash, which uses layers to situate images in space. (No word on how it might work with the newly introduced Z axis in CS4.) The idea that the software can convert 2D programs into 3D sounds interesting but, of course, it depends on how it does so. Flash now lets us work with depth with the same ease we're used to when placing objects left, right, up or down. Now imagine turning that depth into DEPTH!
James Hurley mentioned a few things about the 3D plug-ins, and the most interesting was the plug-in for games. This plug-in does not modify the game's code; instead it intercepts the rendering information headed for the video card and renders several versions of each image so that each eye views the scene from its own perspective. This happens in real time and is nothing short of brilliant. Well, we did not see it, but it's brilliant anyway.
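Here's my toy guess at the geometry involved - not the plug-in itself - in Python: take the game's single view matrix, slide it sideways a little for each eye (or for each of the several views a multi-view screen wants), and render the same scene once per view.

```python
# Produce one view matrix per eye/view by sliding the camera sideways.
# The spacing and view count here are made-up illustration values.
import numpy as np

def eye_view_matrices(view, eye_separation=0.065, n_views=2):
    """Shift a 4x4 view matrix sideways once per view."""
    offsets = np.linspace(-eye_separation / 2, eye_separation / 2, n_views)
    matrices = []
    for offset in offsets:
        shift = np.eye(4)
        shift[0, 3] = -offset          # sliding the camera right shifts the world left
        matrices.append(shift @ view)
    return matrices

# the plug-in would then draw the unchanged scene once with each matrix;
# multi-view screens want more than two views
views = eye_view_matrices(np.eye(4), n_views=8)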
Look for an iPhone viewer in a few months. You can buy a 19" 3D viewer right now and a smaller (but very bright) screen for your laptop in the spring.
There are a few cool links you might want to explore:
If you've always wanted to go to Mars but just haven't had the time you can visit NASA and the JPL for stereo pics sent to us from the Mars Rover. (glasses required).
Long live 3D! May Blood spurt onto the audience forever!
You can find some nice 3D images on Flickr, but you will need to hold a piece of paper sideways to the screen and sort of go cross-eyed to get the effect.
But, really, the best experience is the Original! Creature From the Black Lagoon in 3D!!!!! I don't think it's coming to a multiplex near you but for only $18.00 (less than two tickets and much less than two tickets plus popcorn) you can see what you missed.
At 3D-Geek you can find Andy Warhol's Frankenstein in 3D. You must be over 18 and twisted for this one.
I can vouch for it. You won't be sorry. (And you can sell the DVD on Craigslist when you're finished watching it.)
Labels: 3D, 3d games, 3d movies, autostereoscopy, no glasses, Spatial View, wazabee
Sunday, February 8, 2009
GestureTek - Developing for the Premier's Awards
IMM Class at GestureTek.
Perhaps one of the best things about Interactive Multi-Media (the program that I am swimming through - don't be fooled, swimming is neither easy nor graceful) is that we have the opportunity to develop real projects.
Last term my group (a team of four classmates) made a game for the Girl Guides of Canada. Here's a link to the Girl Guide site but I don't know if our game is up yet. It's called Cookie Frenzy. If you're older than 12 expect to have trouble with it. If you're younger than 12 expect to finish in about 15 minutes.
This term we will be using the groundbreaking technology from GestureTek. I enjoyed Vincent John Vincent's history of the company because it reminds me of the time when bands used to incorporate light shows into their performances. Jefferson Airplane would have huge globs and swirling shapes on an enormous screen behind the band. Early GestureTek consisted of virtual instruments which Vincent John Vincent played from a virtual environment on the stage. Same sensibility. (Insert your own reference to drugs here.)
From art to advertising, mostly. GestureTek software and hardware is now used to delight children enough to pester their parents into buying the latest toy. But not entirely. The ability to capture real motion in real time and apply it to virtual worlds will appear at the Vancouver Olympics, where users will fly through British Columbia. It's used in educational settings and in rehab medicine, and it will soon surface in massively multiplayer games, where it will capture the player's real motion and apply it to an avatar. Yikes! We are the future.
Vincent John Vincent founded the company in 1983 with Francis MacDougall. Vincent John Vincent is the "ideas" part of the company while Francis MacDougall leads the development of the technology. The company really came into its own around 1985 with the introduction of the Commodore Amiga - a brilliant machine, and the first that could manipulate a color image from a video camera in real time.
I remember when this machine came out. It was amazing - an entire color television studio control room in a box. It put anything from Apple to shame.
At any rate, when they bought this machine, Vincent John Vincent, who had studied psychology, began playing its virtual instruments in a band that eventually toured the world.
I will have the pleasure of using this technology as a member of a three person group which will develop an application for the Ontario Premier’s Awards, a lavish ceremony where awards for the most innovative companies and people are announced. About 300 people will be in attendance, most of whom live their lives on the edge of current technologies. We want to show them a good time and we want to push the GestureTek system into new territory.
It is important to stress that the GestureTek system does not replace or replicate a computer mouse. That is, it does not work well with clicking and dragging. What it does do well is to recognize motion with a rough idea of where that motion is and some idea of where it is going. In short, the system excels at capturing broad movements.
Here’s, roughly, how it works. A video projector throws a computer generated image onto a screen – floor, ceiling, front or rear projection. A digital video camera takes in that same scene as an infra-red image. It helps to have infra-red lighting. When something blocks the infra-red light the system picks it up and the computer software places the motion on the screen.
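For the programmers in the room, here is a back-of-the-envelope Python sketch of the detection step as I understand it: compare each infra-red frame to the previous one and find where the change happened. GestureTek's real pipeline is certainly far more sophisticated; this is just the flavour, with made-up frame sizes and thresholds.

```python
# Crude motion detection on greyscale infra-red frames by frame differencing.
import numpy as np

def detect_motion(previous, current, threshold=30):
    """Return the (x, y) centre of changed pixels between two frames, or None."""
    diff = np.abs(current.astype(int) - previous.astype(int)) > threshold
    ys, xs = np.nonzero(diff)
    if xs.size == 0:
        return None                        # nothing moved in front of the camera
    return int(xs.mean()), int(ys.mean())  # roughly where the IR light was blocked

# usage with two dummy 240x320 frames
prev_frame = np.zeros((240, 320), dtype=np.uint8)
curr_frame = prev_frame.copy()
curr_frame[100:140, 150:200] = 255         # pretend a hand blocked the IR light here
print(detect_motion(prev_frame, curr_frame))   # -> roughly (174, 119)
```

The software then maps that position back onto the projected image and triggers whatever effect has been programmed there.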
GestureTek has developed some powerful yet easy-to-use software called "Dazzler" that lets anyone project their own images and program what will happen when the system detects movement. Even better, the images and effects can be stacked so that many images and effects happen at once. While GestureTek's pre-made effects work well, it is also possible to develop custom effects in Flash (CS2 only) and to layer these custom effects into the image along with GestureTek's.
We are looking into the possibility of installing CS4. More to come on that.
All of this means that whatever we make needs to be somewhat simple. I'm sure we will have no trouble learning GestureTek's technology, and I know that we'll get great support from Chris Watts. Working in AS2 might be a little freaky - it's a new language to me and there isn't really time to learn it properly - so, in the end, I think we will be working with some simple graphical routines (I found a nice one that draws trees). It helps that there are some nice tweens available in AS2.
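The routine I found is in AS2, but the idea fits in a few lines of anything. Here's a quick Python/turtle sketch of the same recursive trick - draw a branch, then two smaller branches at an angle, and repeat until the branches get tiny. The angles and lengths are just values I picked.

```python
# Recursive tree drawing with Python's built-in turtle module.
import turtle

def draw_tree(pen, length, depth):
    """Draw a branch, then two smaller branches at an angle, recursively."""
    if depth == 0 or length < 2:
        return
    pen.forward(length)
    pen.left(25)
    draw_tree(pen, length * 0.75, depth - 1)   # left sub-branch
    pen.right(50)
    draw_tree(pen, length * 0.75, depth - 1)   # right sub-branch
    pen.left(25)                               # restore the original heading
    pen.backward(length)                       # walk back down the branch

if __name__ == "__main__":
    pen = turtle.Turtle()
    pen.speed(0)
    pen.left(90)                               # point the trunk upward
    draw_tree(pen, 80, 7)
    turtle.done()
```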
Our project will be like Chanel – simple and elegant.
Wednesday, September 24, 2008
James Eberhardt of Echo Mobile
James Eberhardt talked to the class about mobile applications.
I had the pleasure of taking James' Introduction to Flash last summer at the Rich Media Institute. It was a great class - only ten three-hour sessions - but it gave me an excellent grounding in the basics of ActionScript programming, and an astute student would complete it with an excellent set of tools. The key to success in ActionScript is, I think, the same as the key to learning any language: "speak it every day." Well, I was lucky enough to land a job as a project manager where I worked from about 6 a.m. until 7 p.m. every day. I don't think I could have learned Pig Latin but, somehow, I absorbed something from James' class by osmosis.
James is something of a Renaissance man. That is, if it's cutting edge and it's interactive, he's doing it. He's a founding partner of Echo Mobile, formerly with Marble Media, a member of the board of Flashinto, and a speaker on mobile technology at all sorts of places around the world. You get the idea. James is there.
James spoke to us about QR codes, a technology that's popular in Japan but more or less non-existent here. QR codes are a nice way to read URLs with your phone but the current cost of mobile internet in Canada makes this - and many other mobile technologies - irrelevant to our domestic market.
The iPhone is changing that.
James is currently programming the iPhone.
iPhones come with data plans that make connecting to the internet while away from your laptop and home computer more or less affordable. And fun. I don't have an iPhone but my friend, Tom Rasky, does. We walked down the street the other day, his iPhone in hand and our eyes cast downward to its screen instead of looking around at the lovely evening that surrounded us. No problem, though. We watched the little point of light on his iPhone move along Google Maps as we walked down the street. We knew exactly where we were.
Tom is writing for the iPhone too.
Anyone can get the SDK for the iPhone, and it has two parts - a GUI builder and a code interface. The language, Objective-C, gives the developer access to all of the phone's hardware - camera, internet, mail, movement, GPS and more. It's not ECMAScript-based, which means its syntax is quite different from what you'd know from Flash or Java. When asked what it is like to use, James said that it has kept him up for more than a few nights.
To actually sell an application, it must be approved by Apple, and then you can only sell it through the iTunes App Store. Apple takes a big cut.
Here's what interested me most, though James mentioned it only as an aside.
James mentioned an application that might still be in a conceptual phase. To use it, you walk down the street, pointing your iPhone camera at the passing scene. As the phone scans the images, it looks for logos, and when it finds one it recognizes, a bubble pops up with info on that brand.
While I can do without the advertising, I love the idea of a mobile device that can pick things out of the real world and expand on them. There is an existing iPhone app that works like that with music. It's called Shazam, and you can find it at the iTunes store (D'oh!). Here's how it works: you hear music, you let your iPhone hear it, and then, just like magic, your iPhone tells you what it is. You can buy the music from Apple.
I love the idea of being able to point my phone at things and find out more about them.
Getting back to James' presentation. QR codes have been pasted in interesting places. You point your phone at the QR code and, Shazam!, your phone tells you about it. Great for a walking tour of New York.
a QR code:
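(The QR code image itself didn't make it into this text version. If you want to roll your own, here's a minimal Python sketch using the third-party qrcode package - the URL is just a placeholder, not one from James' talk.)

```python
# Generate a QR code image (requires: pip install qrcode[pil]).
import qrcode

img = qrcode.make("http://example.com/walking-tour")   # encode any URL as a QR code
img.save("qr_code.png")                                # scan it with a phone to get the link back
```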
Labels: application development for mobiles, iPhone, QR, Shazam