Friday, August 31, 2012

A peek into the future! 2020 it is!

Generation Smartphone

It’s the year 2020 and newlyweds Tom and Sara (random!) are expecting their first child. Along with selecting the latest high-tech stroller, picking out a crib, and decorating the nursery, they download the “NewBorn” application suite to their universal communicator; they’re using what we’ll call a SmartPhone 20.0. Before the due date, they take the phone on a tour of the house, letting the phone’s sensors and machine-learning algorithms create light and sound “fingerprints” for each room.
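
That room-by-room “fingerprinting” is, at its core, a small classification problem: summarize each room by its typical sensor readings during the house tour, then match a new reading against the closest stored profile. Here is a minimal sketch of that idea in Python; the rooms, the two features (light and sound level), and the numbers are all invented for illustration and are not any real NewBorn API.

```python
# Minimal sketch of room "fingerprinting": each room is summarized by average
# sensor readings (here, hypothetical light and sound levels), and a new
# reading is matched to the nearest stored fingerprint.
import math

# Fingerprints learned during the "tour of the house" (light level, sound level)
fingerprints = {
    "nursery":     (0.2, 0.1),   # dim and quiet
    "kitchen":     (0.8, 0.6),   # bright, some appliance noise
    "living room": (0.6, 0.4),
}

def identify_room(light, sound):
    """Return the room whose stored fingerprint is nearest to the current reading."""
    return min(
        fingerprints,
        key=lambda room: math.dist(fingerprints[room], (light, sound)),
    )

print(identify_room(0.25, 0.15))  # -> "nursery"
```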


When they settle Tom Jr. down for his first nap at home, they place the SmartPhone 20.0 in his crib. Understanding that the crib is where the baby sleeps, the SmartPhone activates its sudden infant death syndrome (SIDS) application and uses its built-in microphone, accelerometers, and other sensors to monitor little Tommy’s heartbeat and respiration. The “Baby Position” app analyzes the live video stream to ensure that Tommy does not flip over onto his stomach—a position that the medical journals still report contributes to SIDS. The NewBorn application suite updates itself with the latest medical findings. To lull Tommy to sleep, the SmartPhone 20.0 plays music, testing out a variety of selections and learning by observation which music is most soothing for this particular infant.

As a toddler, Tommy is very observant and has learned the combination on the gate to the swimming pool area. One day, while his parents have their backs turned, he starts working the lock. His SmartPhone “Guardian” app recognizes what he is doing, sounds an alarm, disables the lock, and plays a video demonstrating what could happen if Tommy fell into the pool with no one else around. Not happy at being thwarted, Tommy throws a tantrum, and the Guardian app, noting his parents’ arrival, briefs them on the situation and suggests a time-out.


But the SmartPhone 20.0 won’t be just a high-tech baby monitor. Rather, the device or smart mobile devices like it will serve as nanny, nurse, or golf caddy—the perfect assistant for people of all ages. If you think that people can’t seem to make a move without consulting their phones today, well, you ain’t seen nothing yet.


Let’s age Tommy to 3 years old. Tom and Sara take him skiing for the first time. Tommy’s SmartPhone, now version 23.0, downloads the “Virtual Skiing Coach,” which uses accelerometers sewn into Tommy’s clothing to sense his posture and then offers suggestions for maintaining balance; when it foresees an impending collision, it quickly blurts out instructions on how to stop. We already have basic sensor-based virtual coaches. For example, some devices use accelerometers and gyroscopes to track motion during rehabilitation exercises and correct errors. Such coaches would enable therapists to remotely monitor home-based exercise, making it easier for seniors to remain at home as they age and reducing health-care costs.


At age 5, with the SmartPhone 25.0 education apps, Tommy has become a curious and eager learner. He looks forward to his first day at kindergarten. He meets Alice, who can neither hear nor speak, but because of her SmartPhone, she is able to easily participate in class. Alice greets Tommy by signing, and her SmartPhone plays a translation provided by the American Sign Language (ASL) app. Tommy responds, and Alice’s speech-recognition app provides her with real-time captioning. Tommy shares his favorite song with Alice, sending it from his SmartPhone to hers, which translates the music into vibrations in a vest she wears.

One day, Tommy is walking home from school, and the SmartPhone 27.0 Guardian app notices that a stranger has started a conversation with him and is coaxing Tommy to get into a van. The Guardian app whispers in Tommy’s ear not to talk to the stranger and tells him to run to a nearby house, one the app has already verified as a local kid-safe house and confirmed that someone is home. The Guardian app takes a picture of the stranger and the license plate of his van and forwards the information to the police. First Person Vision, introduced at the 2011 CES, uses video taken by wearable cameras and smartphones to identify gestures, actions, and faces in real time. It’s not much of a stretch to envision it alerting users to threats.


For Tommy’s 16th birthday, his parents download the “Driving Instructor” app. Of course, by 2036 cars have many safety features but still require the driver to take over in emergency situations, so a driver’s license is still required. Under the tutelage of the app, Tommy becomes an excellent driver; his parents trust that they’ll be alerted if he starts driving recklessly. These kinds of driver-monitoring tools are now in the lab. For example, the DriveCap project at the Quality of Life Technology Center, in Pittsburgh, run by Carnegie Mellon University and the University of Pittsburgh, uses in-car sensors to track driver behaviour (accelerometers can detect erratic maneuvers and sudden changes in braking and acceleration) and the driver’s cognitive load—that is, how attentive, tired, or overwhelmed the driver is—by focusing a camera on the eyes.
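
The in-car monitoring described above largely comes down to watching how abruptly acceleration changes. Below is a rough, hypothetical sketch of how a phone might flag sudden braking or swerving from accelerometer samples; the sampling rate, threshold, and readings are made-up values and are not taken from the DriveCap project.

```python
# Illustrative sketch: flag "erratic" moments when acceleration changes too fast
# between consecutive samples (i.e., when the jerk exceeds a threshold).
# Sample rate, threshold, and readings are invented for illustration.

SAMPLE_INTERVAL_S = 0.1   # assumed 10 Hz accelerometer sampling
JERK_THRESHOLD = 15.0     # m/s^3; would need tuning for a real vehicle and sensor

def erratic_events(accel_samples):
    """Yield the indices where the change in acceleration looks abrupt."""
    for i in range(1, len(accel_samples)):
        jerk = (accel_samples[i] - accel_samples[i - 1]) / SAMPLE_INTERVAL_S
        if abs(jerk) > JERK_THRESHOLD:
            yield i

# Forward acceleration in m/s^2: steady cruising, then a hard brake at index 4.
readings = [0.2, 0.1, 0.3, 0.2, -6.5, -6.0, 0.1]
print(list(erratic_events(readings)))  # -> [4, 6]
```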


Years later, Tom Jr.’s SmartPhone (upgraded, of course, many times over the years) continues to be a trusted companion. On a business trip, the “Administrative Assistant” app reminds Tom of people’s names and their connections to him; this is an easy-to-imagine extension of First Person Vision. Tom has an appointment in a large building complex, which has a confusing maze of corridors and bridges between buildings. Tom’s SmartPhone snaps pictures for comparison to an archive of pictures of different parts of a building; that’s something the First Person Vision app already does. By locating his position on a floor plan and knowing his destination, the “Building Navigation” app can efficiently guide him to his meeting. Applications like this already exist; the simplest are based on indoor maps developed by Google Places for Business.
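
Once the phone has matched a photo and placed Tom on the floor plan, guiding him to the meeting is essentially a shortest-path search over a graph of rooms and corridors. The sketch below shows that step on a tiny invented floor plan using breadth-first search; the node names and layout are hypothetical and not drawn from any real indoor-mapping service.

```python
# A toy floor plan as a graph: nodes are rooms/junctions, edges are walkable links.
from collections import deque

floor_plan = {
    "lobby": ["hall_a", "elevator"],
    "hall_a": ["lobby", "hall_b"],
    "hall_b": ["hall_a", "bridge", "room_210"],
    "bridge": ["hall_b", "hall_c"],
    "hall_c": ["bridge", "room_305"],
    "elevator": ["lobby", "hall_c"],
    "room_210": ["hall_b"],
    "room_305": ["hall_c"],
}

def route(start, goal):
    """Breadth-first search: return the fewest-hops path from start to goal."""
    queue = deque([[start]])
    visited = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in floor_plan[path[-1]]:
            if nxt not in visited:
                visited.add(nxt)
                queue.append(path + [nxt])
    return None  # no route found

print(route("lobby", "room_305"))  # -> ['lobby', 'elevator', 'hall_c', 'room_305']
```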


On one trip, Tom twists his ankle while jogging. His SmartPhone directs him to the nearest emergency room; iPhone 4S users are already familiar with the Siri app’s ability to do this kind of location finding. Later the SmartPhone recognizes that Tom is using his crutches incorrectly and gives him some pointers. While a “crutches coach” is not currently on the market, similar coaches have been demonstrated in the field. People who use manual wheelchairs are susceptible to repetitive-use injuries to their wrists and shoulder rotator cuffs. Researchers at Carnegie Mellon and the University of Pittsburgh have tested accelerometers in a wristwatch-like bracelet that classifies arm movements and encourages the patterns that generate the least stress on the wrist and shoulder. Powered wheelchairs are being used to test more sophisticated built-in sensors that help users with spinal cord injuries avoid developing pressure sores by making sure they shift position frequently; these devices have also been tested at the two Pittsburgh universities.

Tool kits that simplify the development of augmented-reality applications already exist, such as ARToolKit, an open-source project supported by the University of Washington; the University of Canterbury, in New Zealand; and ARToolworks, in Seattle. To pass along his father’s life lessons, Tom records video of his father answering a variety of questions. In years to come, Tom’s son will ask questions, which the SmartPhone’s speech recognizer will match against an automatically generated index of the video clips, letting the grandson have simulated conversations with his grandfather.



Even later, Tom’s declining health requires ever more monitoring by his doctor. Fortunately, Tom’s SmartPhone Health app allows his doctor to request routine self-monitoring tests using sensors built into the phone. The app administers these tests to Tom according to a schedule set by the doctor, who can review test results and order additional tests if necessary. The Health app also monitors Tom’s activities and notifies the nurse in the independent living complex where he now resides if there are any anomalies. Today, “health kiosks” that perform these functions are already in use in workplaces and senior living centers. New applications like VitalClip, an iPhone accessory soon to go into a private beta test, already allow users to measure vital signs by touching a finger to a sensor.


The apps that help Tom throughout his imaginary life are all straightforward extrapolations from what exists today. But technology isn’t always bound to a straight path. In the future, the SmartPhone and smart communicators like it will decrease in size until sensing and computing are simply part of everyday objects, integrated into the outer “skins” of devices, woven into clothing, and embedded into countertops. This integrated technology will be situationally aware, understanding the user’s intent and jumping in to help without a touch or a voice command. The Tommys of the future will be protected by helmets and uniforms that anticipate potential concussion-causing collisions and quickly react with counterforces that minimize bruising of the brain. Their footballs will signal “first down” from the bottom of the pile of players—no human judgment necessary. Their kitchens will figure out what meal is being made as ingredients are pulled from the refrigerator and step-by-step preparation instructions are displayed on the countertop. All this technology will have a zero carbon footprint, as it scavenges energy from radio waves in the environment and biodegrades when it is discarded. And we can see this future reflected in today’s smartphones.


-TECHNICAL TEAM!

Thursday, August 30, 2012

Happy Birthday Ankita!




To a girl who is always there with a smile on her face, no matter how much work awaits her... Miss Kalamzine wishes Ankita a very Happy Birthday! Stay Blessed and always keep smiling... :)

Happy Birthday! :)

Team of the Month


CONGRATULATIONS!

The Team of the Month award is bagged by the Journalism Team for their most active participation throughout the entire month...! Paparazzi - one amazing creation by the J-Team...! A must-watch page... Keep it up, girls... :)

Member of the Month

CONGRATULATIONS!


Member of the Month is Diksha Sharma, III year... Congratulations! :) Her zeal for the work throughout the month was marvelous and commendable... Keep it up... :)

Thursday, August 23, 2012

English is a funny language, just don't blame anyone!




English, the universal language, is a peculiar one. Change the spelling and the meaning changes. A missing character in a word makes a lot of difference. Seeing the damage caused, you are ‘spellbound’. Especially when a ‘Show-cause Notice’ becomes a ‘Showcase Notice’ and ‘insulating material’ turns out to be ‘insulting material’. Never mind. You did not want to insult anyone. It was only a typographical error, or rather a human error.


We frequently come across such errors. Our personality largely depends on our style of communication; we can either make it or mar it. Errors slip in due to inattention while typing. Normally we are annoyed at a spelling mistake, but sometimes errors trigger ripples of laughter too, and there we are, laughing our heads off at our own mistakes.

Let us now talk of the errors made by typists. Years and years of typing drafts and re-typing them make their job monotonous, and typing becomes a reflex action. Here is a scientific fact in their favour: reflex actions or routine operations in a living creature can be handled by low-level mechanisms that do not involve thinking in the central brain. The reflex comes into play especially after the lunch hour. That is precisely when a ‘Turnkey Project’ becomes a ‘Turkey Project’, a ‘Pump house’ changes to a ‘Pump hose’, and the ‘world’ is almost reduced to a ‘word’.

The story of stenos is quite different. A lot depends on the pronunciation of the boss, the noise pollution around, and so on. And above all, there is the logic of the steno in transcribing her shorthand scribbling. For a sleepy steno, most of the project would be ‘sleeping’ (slipping), the salient features of a machine are ‘silent’, and the most ideal manpower is the ‘most idle’.

The problem assumes greater dimensions when similar syllables appear in the dictation. Sometimes, 5 MT bullets go empty, the Sole Distributor is a Soul Distributor, peace is always in pieces, and pray falls a prey. What is more, the Indore Office may become an indoor office.

It would be an injustice if no mention were made of the printer’s devil. The circumstances under which printing errors occur are no different; the difference lies in the publicity they receive. In a rare case, an error was really a terror. Just think of the laboratory in-charge who was sent a sample of contaminated kerosene and asked to ‘taste’ it. If he did what the letter asked him to do, it was his own fault; he was merely supposed to ‘test’ it.

By keeping strict vigil, we can greatly minimise the number of errors in our communication, but we can never reduce it to zero. After all… to err is human.
Contributed by: ENGLISH TEAM