Inspired by Ryan's post on the 4 things he'd like invented, except I'd focus on things that I think are actually workable. Here goes.
Intelligent automotive gearboxes
Can I confess? I'm a closet automotive gearbox fanatic, so this has special meaning to me.
While modern automatic transmissions fare quite well, they have one major drawback. They're dumb.
Scenario 1: You're going uphill. The car slows down. You mash the accelerator, trying to get it to downshift quickly so you don't lose too much speed. Nothing happens.
Then, 5 seconds later, the engine rpm falls to the point where the gearbox finally downshifts, and because it's shifting under a huge amount of torque, the car lurches violently.
Any driver of a manual transmission who isn't braindead would have shifted down earlier, made a smooth transition, and powered up the hill.
Why? To put it bluntly, the automatic transmission can't judge the current gradient, let alone predict the gradient of the road ahead.
Scenario 2: Guy driving a manual transmission sees the lights turning red. He brakes gently in 4th gear, without bothering to drop sequentially through the gears as he slows to a halt.
Guy driving an auto transmission brakes. Car downshifts from 4 to 3. Lurches. 3 to 2. Lurches again.
BUT: Guy driving a manual transmission slows down to join a pack of cars that has just started moving off from a light that turned green. He shifts straight into 2nd and speeds off with them.
Guy driving an auto transmission lurches down to 3, and before he slows enough for the gearbox to drop into 2, he accelerates; the engine putters along for a couple of seconds before the gearbox finally decides to shift to 2.
No fun.
Why? Automatic transmissions can't read road situations.
My solution? An artificial intelligence system that reads the current road gradient using something like a pendulum weight, the change in gradient ahead using infrared sensors, and the traffic situation using a camera.
It'd be a challenge to program an algorithm that reads the camera feed and works out whether the light ahead is red or green, whether the car in front has its brake lights on, and how far away everything is, but I imagine it can be done.
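To make this concrete, here's a toy sketch in Python of what the shift-decision logic might look like. All the sensor inputs, thresholds, and gear choices are invented for illustration; a real transmission controller would obviously be far more involved.

```python
# Toy sketch of a "smart" shift decision. Sensor names, thresholds and
# gear numbers are all made up purely to illustrate the idea.

def choose_gear(current_gear, speed_kmh, gradient_ahead_pct,
                light_is_red, traffic_moving_off):
    gear = current_gear

    # Downshift *before* the climb, instead of waiting for engine rpm
    # to sag and then lurching under full torque (Scenario 1).
    if gradient_ahead_pct > 5 and gear > 3:
        gear = 3

    if light_is_red and not traffic_moving_off:
        # Braking to a stop: hold the current gear rather than lurching
        # through 4-3-2, then take 1st once nearly stationary (Scenario 2).
        if speed_kmh < 5:
            gear = 1
    elif traffic_moving_off and speed_kmh < 40:
        # The pack ahead has just moved off: drop straight into 2nd so
        # the car can accelerate immediately, like the manual driver.
        gear = 2

    return gear

# Cresting at 60 km/h with a 7% climb ahead: pre-emptively take 3rd.
print(choose_gear(current_gear=4, speed_kmh=60, gradient_ahead_pct=7,
                  light_is_red=False, traffic_moving_off=False))  # -> 3
```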
Modular computing
This needs little introduction. Every part of your computer comes in a different shape and size. Video cards and hard disks have different connectors, different power sockets, different places to mount them.
How about a Lego-style system where you just connect the parts together, regardless of what they are, without caring about what goes where? You wouldn't even need to open up the computer casing.
Parallel cables have been superseded by USB connectors with just 4 pins. Serial ATA connectors use embarrassingly few wires compared to the old 80-wire IDE cables. So why not modular connectors too? It'd work.
Computerised medical diagnosis
We're living in an age of cookbook medicine. There's so much information, so much clinical trial data out there, that it's impossible for any human mind to grasp it all at once. So we have clinical practice guidelines, where everything is condensed into algorithms, objective biochemical targets and the like.
The truth is, 90% of the time, doctors are applying cookbook medicine. There's lofty talk about tailoring treatment to individual patients, but really, would you go against the guidelines and protocols and take that risk in an increasingly litigious profession? So you go by the cookbook. Tried. Tested. Evidence-based. Modify it if necessary.
I'm envisioning a system where a computer asks relevant questions about signs and symptoms, then, using existing data, ranks the possible diagnoses by probability and suggests treatment based on established guidelines. It could also use artificial intelligence to keep itself updated on the local prevalence of different disease presentations.
It would cut down on errors and make sure doctors don't miss things.
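As a rough sketch of how the ranking could work, here's a toy Python example with a naive-Bayes-style score: each candidate diagnosis starts at its local prevalence and gets multiplied by how typical each reported symptom is for it. The disease table and every number in it are invented for illustration, not real clinical data.

```python
# Toy diagnosis ranker. The knowledge base below is entirely fictional;
# a real system would draw on published prevalence and trial data.

KNOWLEDGE_BASE = {
    # diagnosis: (local prevalence, {symptom: P(symptom | diagnosis)})
    "influenza":   (0.10, {"fever": 0.90, "cough": 0.80, "headache": 0.60}),
    "dengue":      (0.02, {"fever": 0.95, "rash": 0.50, "headache": 0.80}),
    "common cold": (0.30, {"cough": 0.70, "runny nose": 0.90, "fever": 0.20}),
}

def rank_diagnoses(symptoms):
    """Rank diagnoses by a crude naive-Bayes-style score."""
    scores = {}
    for diagnosis, (prevalence, profile) in KNOWLEDGE_BASE.items():
        score = prevalence
        for symptom in symptoms:
            # Unlisted symptoms get a small "rare but possible" probability.
            score *= profile.get(symptom, 0.05)
        scores[diagnosis] = score
    total = sum(scores.values()) or 1.0
    return sorted(((d, s / total) for d, s in scores.items()),
                  key=lambda pair: pair[1], reverse=True)

# A patient reporting fever and headache: influenza comes out on top,
# with the common cold trailing far behind.
print(rank_diagnoses(["fever", "headache"]))
```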
Michael Crichton has quite a bit to say about this idea in his book Five Patients.
But then, the old fogeys would never accept it!
Language-specific data compression algorithms
Ebooks are getting more popular, legit or otherwise. They're large chunks of text, and they're often left uncompressed or, at best, packed with some generic compression system like ZIP or RAR.
Words like 'because' and 'however' are a Scrabble player's wet dream, yet they appear constantly in plain text and each takes up a whopping 7 bytes in an uncompressed file.
Letters like 'z' appear so rarely that you wouldn't lengthen the file much even if you used 2 or 3 bytes to represent them.
A system could be programmed to analyse texts from various sources (technical manuals, fiction, nonfiction, etc.) to determine which long words are used most often and encode them with short forms. Conversely, rare letters and words can be encoded in less efficient ways.
Of course, you can do the same for subparts of words, such as the suffix 'ing'.
It could also re-analyse the specific text being encoded to build a further dictionary, so that words peculiar to that text get their own short forms. You'd expect a book on how to use Microsoft Word to contain the word "Microsoft" a lot more often than other texts do.
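Here's a bare-bones Python sketch of the core idea: scan the text, find the long-but-frequent words, and swap each one for a 2-byte code. It's purely illustrative (real compressors like ZIP already do something much smarter with Huffman coding and LZ77), and it skips details a proper version would need, such as escaping the marker byte and handling capitalisation.

```python
# Toy word-level compressor: long, frequently used words get 2-byte codes.
# Illustrative only; there's no escaping of the marker byte in literal text
# and no decompressor shown.

from collections import Counter
import re

MARKER = "\x00"  # prefix byte that flags a dictionary code

def build_codebook(text, max_entries=128):
    words = Counter(re.findall(r"[a-z]+", text.lower()))
    # Rank candidates by bytes saved: (word length - 2-byte code) * frequency.
    savings = {w: (len(w) - 2) * n for w, n in words.items() if len(w) > 3}
    best = sorted(savings, key=savings.get, reverse=True)[:max_entries]
    return {w: MARKER + chr(i + 1) for i, w in enumerate(best)}

def compress(text, codebook):
    # Replace every known word with its short code, leave the rest alone.
    return re.sub(r"[a-z]+",
                  lambda m: codebook.get(m.group(0), m.group(0)), text)

sample = ("because the word because appears again and again, however, "
          "it deserves a short code, because that saves quite a few bytes")
codebook = build_codebook(sample)
packed = compress(sample, codebook)
print(len(sample), "->", len(packed), "characters")
```

A real implementation would build the base dictionary from a large corpus of the relevant genre, then extend it with a second pass over the specific text, exactly as described above.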
==
If you didn't understand me, it's OK. I'm rambling.
Feel free to steal my ideas and patent them; I'm not going into those fields anyway.