When I look at the evolution of new technologies, one commonality that catches my attention is that they all strive towards abstraction. When it comes to solving a problem, we want the problem, or rather the solution, to rise to a level where all the underlying levels are either already solved or taken care of by someone else. A calculator is a fine example (maybe a very shallow one..:)). When you use it, you implicitly assume that it is doing the right thing. You don't care about the algorithm or circuitry it uses. All you care about is that it gives you the right result. And a CEO, or anyone who gets results from an accountant who uses the calculator, probably doesn't want to know the sequence of calculations that were used, as long as he gets his result, and so on. For someone who is a beneficiary of technology, it is a boon not to have to know all the underlying mechanisms. But is the same true for someone who develops the technology?
In life, the interesting stuff happens only during transitions. I think managing transitions is what life is all about. Whenever people talk about anything, they only talk about the first few minutes or hours or months, or the last few... Anything that happens in between is either overlooked or not talked about... The reason transitions are more exciting is probably that they are also the most vulnerable part. Anyway, that was a digression in my thought flow... What has this got to do with abstractions? When technology goes through transitions, things don't always look good. While abstractions are being built, they are unstable, and some people almost always don't want everything to be abstracted, since they lose a sense of control over what they do. So, a technological transition suffers this major hurdle as it paves its way through.
One of the hot topics in technology today is virtualisation. Virtualisation of all kinds. In a nutshell, what businesses want to do is give users an interface to manage stuff, while everything else is taken care of behind the scenes. For example, you don't really need to have a computer. You will have an interface that emulates what you get out of a computer. You could ask for a computer with Linux or a Microsoft OS and xyz applications, and you will get it on demand. Your disk storage will be on demand. Your internet speed will be on demand. You will be given services that make sure your data is secure. You don't have to worry about software upgrades. You just pay for a single service... Abstraction of technology to a level where the user doesn't have to tinker with anything themselves... If you are a "server hogger", you may not like this. If you are sceptical about security, you might be concerned. But this is the model that will be followed as technology progresses. You will hear terms like Virtual Machines, Virtual Platforms, Virtual I/Os and so on.
To me, virtualisation is just one facet of abstraction. The interesting thing about abstraction is that businesses can capitalize on peak consumption, or the lack thereof. Look at airlines: they always oversell tickets on the assumption that not everybody who signed up to travel will show up. The same is true of internet bandwidth. You would be amazed at how much oversubscription happens on a given link. Again, the same is true of computer utilisation. How many of us use our computers to their maximum computational capacity? We all aspire to the fastest computer though..:) As technology gets abstracted to the next level, industries can make use of the inefficiencies in individuals' usage so that more people think they have all the computational power they need... It is good for the consumer because they don't have to deal with installing new software or hardware, which in some cases might require a PhD even for the "skilled"... Though abstraction is a fact of life, I still think there is a sort of vacuum that it leaves behind for fresh aspirants...
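To put a rough number on that oversubscription idea, here is a toy back-of-the-envelope sketch in Python (the figures are made up purely for illustration):

# A toy illustration of oversubscription (all numbers are hypothetical).
# If each user keeps their "dedicated" server busy only 5% of the time on
# average, a provider can back many users with far fewer physical servers
# and still rarely run out of capacity.
import math

users = 1000            # subscribers who each "own" one virtual server
avg_utilization = 0.05  # fraction of time each user's server is actually busy

# Expected concurrent demand, plus a safety margin of three standard
# deviations (treating each user as an independent coin flip, i.e. binomial).
expected_busy = users * avg_utilization
std_dev = math.sqrt(users * avg_utilization * (1 - avg_utilization))
servers_needed = math.ceil(expected_busy + 3 * std_dev)

print("Physical servers needed for", users, "users: about", servers_needed)
print("Oversubscription ratio: %.1f : 1" % (users / servers_needed))

With each user busy only a small fraction of the time, a thousand "dedicated" machines can be backed by well under a hundred physical ones, and that gap is exactly what the business capitalizes on.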
Tuesday, August 14, 2007
Abstraction
Posted by Suresh Sankaralingam at 9:39 AM
10 comments:
@mindframes: I think I might have to disagree on a few of the points:
1. Abstraction is important to the developer too. It just depends on what one is developing. Think of the life of a software programmer: they started with assembly language and realised the language didn't abstract most aspects of the CPU, so people programming were essentially dealing with the idiosyncrasies of the CPU. They faced a whole lot of problems, so they came up with higher-level languages which in reality were abstracting the underlying hardware (to some extent in some cases and a large extent in others). Now programmers were focusing more on the problem than on the hardware, and the compiler was dealing with the de-abstraction.
The same thing can be said at every level of technology development.
2. I'm not sure the purpose of virtualisation is efficient usage of resources. In reality, look at a Sun box that scales from a few users to a few thousand users... it still handles the demand seamlessly. Unused CPU cycles from one user are just used by another in the SMP model. The same goes for bandwidth or disk usage.
In a server scenario, I always thought virtualisation gives you the ability to run whatever platform (OS) you want. So a virtualised server will be running Linux for me and Windows for you.
Also, the other gain is actually security (rather than it being a concern). In a multi-user environment, a kernel panic will bring all users down. But in a virtualised environment it will only bring down my virtual machine, while you chug along happily. The same goes for worms, viruses, and hacks.
@mano: I am not saying that abstraction is bad, if that is your interpretation. I am just saying that it leaves a vacuum if the developer doesn't understand what he is doing. I do agree that a developer is more productive using abstracted layers. But when it comes to building something new, he is left with his abstracted layer, which may not be a standard across different types of products. You could argue that such information is not needed for a layman programmer who doesn't care... I agree with that.
I think you didn't understand my point about efficient resource utilisation. Different processes sharing a single server (possibly with multiple cores) is different from different processes sharing multiple servers (blades). If the virtualisation software is "smart" enough, it can load-balance in an efficient way. The idea is to give deterministic performance to a given application, not to distribute it based on availability, which is what a multithreaded OS does. If what you say is true, multi-user operating systems like Unix inherently offer all the capabilities of a virtual environment, don't they? I don't think they have security issues either...:)... Advantages like security and protection are irrefutable. But in my opinion (and many others'), guarantees of peak CPU and memory utilisation are the key...
A concise/cleaner summary (in my opinion) of my paragraph 2...
Virtualisation decouples the hard mapping of a physical resource to its user. This allows demand-based resource utilisation... It also allows failover guarantees, by dynamically and seamlessly binding a process (user) to a new physical resource... The physical resource can be a server, a group of servers, link bandwidth, or whatever... Am I making sense?
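To make that summary concrete, here is a toy sketch of the decoupling (the class and names are purely illustrative, not any particular product's design): users hold a stable virtual name, a mapping layer decides which physical host backs it, and failover is just a change to that mapping.

# Toy sketch: users see a stable virtual resource; a mapping layer decides
# (and can change) which physical host actually backs it.
class VirtualResourcePool:
    def __init__(self, hosts):
        self.hosts = list(hosts)   # physical servers (or links, disks, ...)
        self.binding = {}          # virtual machine name -> physical host

    def _load(self, host):
        return sum(1 for h in self.binding.values() if h == host)

    def allocate(self, vm_name):
        # Demand-based placement: put the VM on the least-loaded host.
        host = min(self.hosts, key=self._load)
        self.binding[vm_name] = host
        return host

    def fail_over(self, dead_host):
        # Rebind every VM that lived on the dead host to a live one.
        # The user's virtual machine name never changes.
        self.hosts.remove(dead_host)
        for vm in [v for v, h in self.binding.items() if h == dead_host]:
            self.allocate(vm)


pool = VirtualResourcePool(["blade1", "blade2", "blade3"])
for name in ["vm-alice", "vm-bob", "vm-carol"]:
    print(name, "->", pool.allocate(name))
pool.fail_over("blade1")
print("after failover:", pool.binding)

Neither the placement decision nor the failover is visible through the virtual name the user holds; that is the decoupling.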
@mindframes: Wow, the second version of the second para was really nice. And I think I get what you are saying with para 1 too. Of course having a full understanding of any system is better, but in most cases, as systems become complex, abstraction becomes more essential. But still, no arguments with what you say; more knowledge is inherently better.
Nice writeup btw, it made me think, which is no mean task. :)
Your comments made me rethink...:)...Thanks dude...
Very interesting concept.
Now that providing bandwidth is getting cheaper and cheaper, people have to come up with different mechanisms to make money.
NetPC, the joint venture by Oracle & Sun, may have been a little too early to market. Nobody expected bandwidth availability to grow so fast so soon.
@Mindframes,
Thanks for the clarification. Between your comment and, of course, talking to you about it, I understand virtualisation better now.
And I do agree about abstractions. I like to have control and it bugs me when things are at an abstract level and I don't understand what is happening.
@ Mindframes: Very interesting blog. Took a few readings to let it sink in though. While I can see your point on abstractions, I revert back to a point I made in a previous blog (assuming I understood your point). Each one of us is a specialist in our own area. Therefore, neurosurgery is not an abstraction to a neurosurgeon; it probably is to you and me, and vice versa with our respective fields.
If we extrapolate the argument to daily life, would it not be an inefficient use of resources if each one of us were to try and figure out everything that happens around us? The idea of a society, and the dependencies induced by living in societies, could potentially be at risk.
What say ppl???????
I am not suggesting that every one of us should try to figure out everything that happens around us... But, from your example, a neurosurgeon had better know all the details and not restrict himself to operating (working with) a robot that performs the surgery...
Another classic example is using Excel to find optimal equations (like a best fit) for a given problem... If you use it without knowing what it is trying to do, you cannot appreciate its advantages/disadvantages... Assuming that a given abstracted layer is good is not good... That's my point...
@ Mindframes: Perfect. Now I get what you are saying. Basically, given that we are experts in our field, we should know what is going on. It is a common problem these days, and I agree. I see it a lot in research. People use canned software like Excel or SAS to run regressions without knowing exactly how the software does its computations. Sometimes canned programs make a mess of things. Excel's regression function is really bad and not dependable. That is why researchers are encouraged to use a language like Matlab, R, or C++ (for us). Using these forces one to learn exactly what is going on behind the scenes.
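As a small illustration of that point (a sketch only, not anyone's prescribed method): an ordinary least-squares "best fit" is a few lines of linear algebra, and writing it out yourself makes it clear what a canned regression routine is actually computing, and where it can go wrong (e.g. a nearly singular design matrix).

# What a "best fit" (ordinary least squares) actually computes, written out
# instead of calling a canned regression function.
import numpy as np

# Toy data: y roughly follows 2*x + 1 with a little noise.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1, 11.0])

# Build the design matrix [x, 1] and solve the normal equations
# (X^T X) beta = X^T y for the slope and intercept.
X = np.column_stack([x, np.ones_like(x)])
beta = np.linalg.solve(X.T @ X, X.T @ y)
slope, intercept = beta
print("slope = %.3f, intercept = %.3f" % (slope, intercept))

# Cross-check against a library routine that does the same fit.
print(np.polyfit(x, y, 1))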