Sunday, February 24, 2008
PicLens: Coolest Web photo viewer ever . . .
"Visually, the results are stunning. This Firefox plugin is going to find a lot of fans very, very quickly." - Review on Techcrunch.com
" Coolest Web photo viewer ever . . . It's stunning." -- Review by CNET
Piclens is currently works on:
* Flickr, Smugmug, DeviantArt, Photobucket, Picasa
* Facebook, MySpace, Bebo, Hi5, Friendster
* Image search on Google, Yahoo, Ask, Live, and AOL
* A growing number of Media RSS-enabled web sites
With PicLens, viewing images on the web will never be the same again.
Now available for Windows and Mac.
https://addons.mozilla.org/en-US/firefox/addon/5579
Tuesday, January 29, 2008
Boot It !
And half the screen is covered with a big white stripe,
The vendor won't pay any mind to your gripe,
So boot it. Just boot it.
When you discover that a process won't die,
If kill -9 won't work there's nothing else to try.
Your jobs are dead meat, so kiss 'em goodbye
And boot it. 50 hours of work,
Just boot it, boot it.
And if you can't boot it, shoot it!
When you reboot it, work will be lost.
It doesn't matter what this will cost.
Just boot it. Just boot it.
Just boot it. Just boot it.
When all the characters are coming out weird,
And won't come back right even when the screen is cleared,
You can't fix such things by tugging your beard
So boot it. Just boot it.
If your computer still is running Windows,
And every time it crashes your frustration grows.
When the system's not free, you will always be hosed.
Just boot it. Put a GNU system on,
And boot it, boot it.
Or put it in your horn, and toot it!
It doesn't matter what was to blame.
Till you reboot it, your machine's lame.
Just boot it. Just boot it.
Just boot it. Just boot it.
It doesn't matter what you did wrong.
Till you reboot it, your machine's gone.
Just boot it. Just boot it.
Just boot it. Just boot it.
Monday, January 21, 2008
RMS Sri Lanka Visit BLOGS & PICS
http://www.web2media.net/laktek/2008/01/19/rms-in-sri-lanka/
Hard facts on free software with RMS
Richard Matthew Stallman, popularly known as “RMS”, the founder of the GNU Project and the Free Software Foundation (FSF), was in Sri Lanka this week. He is an American software freedom activist, a hacker, and a software developer. In September 1983, he launched the GNU Project to create a free Unix-like operating system. Stallman pioneered the concept of copyleft and is the main author of several copyleft licences, including the GNU General Public Licence, the most widely used free software licence. He has also developed a number of highly used development tools, including the GNU Compiler Collection (GCC), the GNU symbolic debugger (GDB) and GNU Emacs. Stallman co-founded the League for Programming Freedom in 1989.
The Nation Economist was able to get a special interview with the software freedom activist. The following are excerpts.
(Q) What is Free Software?
(A) ‘Freedom’, I believe, is translated into Sinhala as “Nidahas.” Free Software means software that respects the user’s freedom. The idea is that computer users should be free. The crucial issue is always: what are the essential freedoms that everyone should have?
They should have four essential freedoms.
• Freedom 1: The freedom to run the programme as you wish.
• Freedom 2: The freedom to study source code and modify the programme.
• Freedom 3: The freedom to copy the programme so you can help your neighbour.
• Freedom 4: The freedom to improve the programme, and release your improvements to the public, so that the whole community benefits.
This makes software part of human knowledge, which you can adapt, extend and pass on to one another.
The alternative is user-subjugating software: proprietary, non-free software. It is software that keeps users divided and helpless. If you are forbidden to share a programme with other people, then it is unethical to use it at all. Divided, because you are not allowed to share; and helpless, because the users don’t have the source code, which is kept secret, so they can’t change anything, and they can’t even tell what the programme is really doing. This is fundamentally unethical. These are predatory practices, which resemble colonial systems. After all, how does colonisation work? Divide and rule. Keep people divided and helpless, and then you can get what you want from them. That’s what proprietary software does. It keeps you divided, by saying you are not allowed to redistribute, and it keeps you helpless, by not giving you the source code. This gives developers power over the users, and that power is unjust. Proprietary software is fundamentally wrong. The goal of the free software movement is to put an end to this injustice. Our aim is that there should be no proprietary software, that all software should be free, and that users of software should have freedom.
(Q) The term “Free Software” has been widely misunderstood. Is it software that is free of charge?
(A) Not necessarily. It is a misunderstanding. It is because, in English, we don’t have a good word for “Nidahas”. We only have the word “free”, which can also mean “gratis”. So, it causes confusion. It even took me a few years to recognise that these two different meanings of the word “free” have to be carefully distinguished. In Sinhala and Tamil, these are two separate words, which explain it well. You have to get the meaning of this word right. It’s not a matter of price at all. It’s a matter of freedom. So, if you think of free speech, not free beer, then you will understand free software. I have nothing against programmers getting paid to write software. In fact, most software that programmers are paid to write is not meant to be proprietary; it is custom software. One client wants to use it and is paying for its development. And that is ethical, as long as the developers respect the client’s freedom. So, most software has nothing to do with this question. But most users are using proprietary software. That means they are victims. They are under the power of developers, who are usually mega-corporations, such as Microsoft, Apple, Adobe and Oracle. There are many of them. As in the case of European colonisation, some countries managed to grab more colonies and others managed to grab a few, but still, it’s wrong. So, rather than trying to judge which colonial power is better, we should put an end to it.
(Q) How did you learn to appreciate free software?
(A) It comes naturally to people who are born programmers. Once you get the idea of programming, it is obvious. I read computer manuals and thought about programming. To learn, I had to do it. I was absolutely fascinated by it. By chance, at MIT, I met a free software community. In the lab, the software we had was free. We were happy to share it with anybody at any time.
(Q) Was this the Hackers’ community?
(A) That’s right. We called ourselves hackers. To be a hacker meant that you were fascinated by computer programming. But, more generally, hacking meant, and still means, playful cleverness. So, if you enjoy finding opportunities to play at being clever, and if you admire other people’s playful cleverness, then you are a hacker at heart.
(Q) But isn’t it unethical to hack?
(A) It might be, in some cases. But generally, no. This term became confused in the 1980s. When the world found out about hackers, it focused on one kind of activity some hackers do. Some hackers, sometimes, do things like breaking security. Why did hackers originally start breaking security? Because, at university, there were administrators who would stop them from using the computers, usually for stupid reasons. There would be a computer nobody was using. Then there would be somebody with something interesting to do who wanted to use that computer. Administrators would oppose that person, citing rules and regulations. So, this clever person, the hacker, who enjoyed playful cleverness, rather than beat his head against the wall, would just go around the rules and use the computer anyway, for research. It was not a matter of harming anyone, because these computers were meant to be used by university students for work and interesting things. So the people who wanted to use these computers did not let the bureaucrats get in their way.
The reason they did this was that they were fascinated by programming and they loved playful cleverness, so their solution to any problem would be playful cleverness. This was not invading anybody’s privacy. The computer did not belong to any person. It belonged to the university, for students to use. They did not steal anything. It wasn’t a bank’s computer. It was just a computer facility at a university, meant for research.
I am not in favour of theft. It’s fine that banks should have security, and I don’t want people to break that security. I don’t want anybody to take my money or your money.
(Q) What happened at the AI Lab, once you started working?
(A) Initially, we had a free software community, and eventually it died due to outside commercial intervention.
(Q) So, that means there were people who had made free software before you?
(A) For sure. It was just people’s way of life. I did not invent free software. In fact, in the 1950s, a lot of software was free, because nobody thought of restricting the user. Even in the 1970s, there was still a fair amount of free software; some operating systems were free software. During the 70s, that mostly disappeared, and by the 1980s, the lab’s free operating systems became obsolete too. Our community died for other reasons.
So, I found myself facing the prospect of spending the rest of my life without freedom, without community, with nothing but a world of ugliness. I did not want to live that way. I decided I would make my life ethical: I was going to fight for freedom. So I started the free software movement. I did not invent free software. I launched a movement to bring back the freedom to cooperate with other people.
(Q) What is the GNU Project?
(A) I wanted to be able to use computers in freedom and cooperate with people. Computers won’t run without an operating system, and there was no free one, so I decided to develop one. I named it GNU. There were about 50 operating systems at that time, but none was free. There were many different kinds of computers with many different operating systems. They differed in technical ways but, in terms of freedom, they were all the same.
(Q) When you founded the GNU project, what was the reaction of the public?
(A) The public did not react at all. But some programmers were enthusiastic, and volunteered to write parts of the system. So, by 1990, we had most of the system. But one important part was missing: the kernel, the programme which allocates the computer’s resources to all the other programmes it runs. It’s the lowest level of the system; the other parts run on top.
In 1992, Linux, which is a kernel, was released as free software. So, when we put together GNU, which was mostly complete, and Linux, to fill the last gap, the result was a free operating system, which was basically GNU but contained Linux as well. So, GNU plus Linux is the fair name for it. And ever since, it has been possible to use computers in freedom.
The community is developing more and more free software. Here you have Sahana, a rather noteworthy piece of free software developed for disaster coordination activities. Its developers can be on one continent, while its users can be anywhere.
I personally don’t do much programming now. There are many people who develop free software now. The most important thing I do now is to spread the idea of freedom. I have always been a freedom fighter. In the 1980s, the best way I could contribute was writing software, because there weren’t many of us then. What I contributed personally was an important, substantial part of what we did in the 1980s.
(Q) What is the difference between “Free Software” and “Open Source”?
(A) Free Software and Open Source are the slogans of two different movements with different philosophies. In the free software movement, the goal is to be free to share and cooperate. We say that non-free software is antisocial, because it tramples the users’ freedom, and we develop free software to escape from that.
The Open Source movement promotes what they consider a technically superior development model that usually gives technically superior results. Free Software and Open Source are also both criteria for software licenses. These criteria are written in very different ways but, the licences accepted are almost the same. The main difference is the difference in philosophy.
(Q) What are your views on the IT sector in Sri Lanka?
(A) It’s difficult to say; I have been here only a few days. Yet I can see Sri Lanka has the same problem as the US, which is that most people are using proprietary software. This is a social problem which needs to be corrected. Proprietary programs, like Microsoft’s, restrict users’ freedom, so you should not use them. You should use software that can be used in freedom.
You get a lot of practical benefits by using free software. It is almost like having freedom of speech. The practical benefits you get are that you don’t pay lots of money to mega-corporations, and mega-corporations can’t restrict your freedom because you are using unauthorised copies. Trying to stop people from sharing is evil. Government should never allow that.
(Q) What is your advice to aspiring young software developers in Sri Lanka?
(A) My advice is: don’t make the mistake of thinking about software only in terms of practical convenience. Don’t forget about freedom. Don’t forget about social solidarity. Anyone trying to stop you from sharing information is trying to tax society. Don’t let them get away with it. If you develop software, respect the freedom of the user. Don’t try to subjugate other people, and don’t let anybody subjugate you. You deserve to be free.
Friday, August 31, 2007
Software Freedom Day: Taking open source to the streets
On Software Freedom Day, teams of enthusiasts hold local events to educate their local communities, supported by the parent organisation, Software Freedom International, and sponsors such as Canonical and the Free Software Foundation. It's an advocacy event whose primary motive is outreach -- promoting free and open source software to local communities.
The event is growing. The inaugural 2004 Software Freedom Day had around 12 teams. Pia Waugh, president of Software Freedom International, says, "We had about 180 teams last year, and this year we are expecting about 250."
Events on the day vary from team to team. Activities usually include informational seminars, software demonstrations and training, and product giveaways. Some teams have been known to dress up in Tux costumes to gain attention and direct people to the action.
Running a Software Freedom Day event is beneficial to Linux and other open source user groups, as the impact can last beyond the actual day.
At Mawson Lakes, a suburb of Adelaide in Australia, Paul Schulz' 2006 Software Freedom Day event brought together different enthusiast groups. The most notable result has been the continuing interaction between local Linux user groups and Air-Stream, an Adelaide-based community wireless initiative. Their cooperation recently included a speaker interchange between the LUGs and Air-Stream. Several new LUGs have also been formed in the area due to the interest sparked from last year's event, and the organizers say they have seen a greater level of open source involvement in the local community center.
Another lasting effect can be seen in neighboring New Zealand, where past Software Freedom Day events in Christchurch have paved the way for a Linux and Free Software training programme. "Since we found a more permanent home for our SFD events, in a suburban library ICT teaching centre, we have been able to maintain the positive relationship through monthly Ubuntu and free software live CD-based evening classes," says Rik Tindall, a member of the 2006 Christchurch team. "Many new users have seen this series advertised, and come along for CDs, tuition, and installation help. Thus we also gained a regular schedule and base for training our SFD volunteers. From SFD, the profile of FOSS in the community has therefore been doubly lifted."
Best Event Photo 2006
The training programme is so popular that there is also a need to cater for children. Tindall says, "This experience has also identified the need for a creche-like facility on SFD, for the many small children that attend a library with parents or grandparents on any weekend. We will prepare a section of our teaching suite accordingly, as a 'penguin creche,' with several games and puzzle packages ready."
By Melissa Draper
Wednesday, August 29, 2007
Software Freedom Day 2007, Sep 15th -- Bigger than Ever
September 15th marks Software Freedom Day, the world's largest celebration and outreach effort about why transparent and sustainable technologies like Free & Open Source Software are so important. Community groups in more than 80 countries organize local activities and programs on Software Freedom Day to educate the wider public about free software: what it is, how it works and its relationship to human rights and sustainability. We already have over 140 teams around the world registered: join them in spreading the word! Registrations for Software Freedom Day teams that want to receive a free SFD team pack close in two weeks, so register now!
"Software Freedom is about creating a digital platform for trust and longevity, particular in a future where more and more of our lives are dependent upon technology" explains Pia Waugh, President of Software Freedom International, the organization behind Software Freedom Day. "It is important we can participate in and trust the software we use in the same way we need to be able to participate and trust in a political system. Ultimately our basic freedoms are only as free as the tools we use, and thus our commitment to Software Freedom."
Support for this year's Software Freedom Day event is fantastic with Google, Mindtouch and the Free Software Foundation coming on board as sponsors as well as long term sponsors the Danish Unix User Group and Canonical. The event also has support from The Open CD, OsCommRes and the International Open Source Network.
"Software Freedom Day is a fantastic event that demonstrates the global reach of open source software." says Jane Silber, COO of Canonical. "We at Canonical are proud to sponsor the event and encourage everyone in the Ubuntu community, as well as the open source community writ large, to participate in this important global event."
Registrations of teams participating in SFD will continue right up to the event; however, teams who wish to receive a free SFD team pack, including stickers, t-shirts, CDs and balloons, must register before the 31st of July, so get in quick! There is also an online shop where anyone can purchase t-shirts and packs of The Open CD. Teams get a 50% discount on all marked prices. Teams that have difficult circumstances can write to the Software Freedom International Board with special requests for additional goodies.
Already this year's event is looking bigger and better than ever before, so what will you do to help take Software Freedom to the world? After all, freedom isn't just for geeks.
[ Thanks to Pia Waugh for this article. ]
Thursday, June 07, 2007
CHAMPION OF SRI LANKA’S ICT INDUSTRY VIDYA JOTHI PROFESSOR V.K. SAMARANAYAKE PASSES AWAY

Prof. Samaranayake was the Chairman of the Information and Communication Technology Agency of Sri Lanka from 2004. He was also Emeritus Professor of Computer Science of the University of Colombo.
Prof. Samaranayake served the Council for Information Technology (CINTEC), the apex national agency for IT in Sri Lanka.
The Government of Sri Lanka honoured Prof. Samaranayake for his contribution to IT with the award of Vidya Prasadini in 1997 and the national honour Vidya Jyothi in 1998. The Japan International Cooperation Agency (JICA) presented its President's Award for International Cooperation to Prof. Samaranayake in 1996 in recognition of his contribution. At its convocation held in January 2005, the …
Reshan Dewapura, COO, ICTA, said: “Everyone in the ICT industry in Sri Lanka …”
His remains will be brought to Sri Lanka and funeral arrangements will take place in Colombo.
Wednesday, March 28, 2007
Richard M Stallman visiting Sri Lanka
Further to an invitation by the ICTA, Richard Stallman has confirmed that he will be visiting Sri Lanka.
Exact dates are yet to be finalized, but it will most probably be in the 2nd week of January. ICTA expects him to stay 2-3 days, and maybe a bit longer if he wants to travel around Sri Lanka. We, the FOSS community, are planning to organize some FOSS events around his visit.
January 2008 is still a long way off, but that gives us plenty of time to plan a decent agenda and organize some really good events.
ICTA will officially host him, and take care of his travel logistics and expenses.
Friday, March 23, 2007
Top ten things professional software development has taught me
1) Object orientation is harder than you think
Maybe it's just me, but coming from Computer Science class I thought that OO was easy. I mean, how hard can it be to create classes that mimic the real world? It turns out that it's pretty hard. Seven years later, I'm still learning how to model properly. I wish I had spent more time reading up on OO and design patterns. Good modeling skills are worth a lot to every development team.
2) The difficult part of software development is communication
And that's communication with people, not socket programming. Now and then you do run into a tricky technical problem, but that's not at all common. Much more common are misunderstandings between you and the project manager, between you and the customer, and finally between you and the other developers. Work on your soft skills.
3) Learn to say no
When I started working, I was very eager to please. This meant that I had a hard time saying no to things people asked of me. I worked a lot of overtime, and still didn't finish everything that was asked of me. The result was disappointment from their side, and almost burning out on my part. If you never say no, your yes is worth very little. Commit to what you can handle, and if people keep asking you for more, make it very explicit that this would mean not doing something else. What I did was to have a list of stuff that I needed to do on a piece of paper with me. When someone asked for something, I showed them the list and asked what I should bump to have time to help them. This allowed me to say no in a nice way.
4) If everything is equally important, then nothing is important
The business likes to say that all the features are equally crucial. They are not. Push back and make them commit. It's easier if you don't force them to pick what to do and what not to do. Instead, let them choose what you should do this week. This will let you produce the stuff that brings value first. If all else goes haywire, at least you've done that.
5) Don't over-think a problem
I can spend whole days designing things in front of the whiteboard. That doesn't mean the result will be any better; it just means it will be more complicated. I don't mean to say you shouldn't design at all, just that the implementation will quickly show me stuff I didn't think of anyway, so why try to make it perfect? As Dave Farell says: "The devil is in the details, but exorcism is in implementation, not theory."
6) Dive really deep into something, but don't get hung up
Chris and I spent a lot of time getting into the really deep parts of SQL Server. It was great fun and I learned a lot from it, but after some time I realized that knowing that much didn't really help me solve the business' problems. An example: I know that at the table level, SQL Server will not take an IU (intent update) lock - it will only take an IX (intent exclusive) lock. This is a performance tweak, since most of the time the IU lock would have to be escalated into an IX lock anyway. To find this out, I spent countless days experimenting, read loads of material and talked to Microsoft people at conferences. Have I ever had any use for this knowledge? Nope.
7) Learn about the other parts of the software development machine
It's really important to be a great developer. But to be a great part of the system that produces software, you need to understand what the rest of the system does. How do the QA people work? What does the project manager do? What drives the business analyst? This knowledge will help you connect with the rest of the people, and will grease interactions with them. Ask the people around you for help in learning more. What books are good? Most people will be flattered that you care, and willingly help you out. A little time on this goes a really long way.
8) Your colleagues are your best teachers
A year after I started my first job, we merged with another company. Suddenly I had a lot of much more talented and experienced people around me. I remember distinctly how this made me feel inferior and stupid. I studied hard, reading book after book, but I still didn't catch up. They had too much of an advantage over me, I figured.
Nowadays, working with great people doesn't make me feel bad at all. I just feel I have the chance of a lifetime to learn. I ask questions and I try really hard to understand how my colleagues come to the conclusions they do. This is why I joined ThoughtWorks. See your peers as an asset, not competition.
9) It all comes down to working software
No matter how cool your algorithms are, no matter how brilliant your database schema is, no matter how fabulous your whatever is, if it doesn't scratch the clients' itch, it's not worth anything. Focus on delivering working software, and at the same time prepare to continue delivering software using that code base and you're on the right path.
10) Some people are assholes
Most of the time, most of the people around you are great. You learn from them, and they learn from you. Accomplishing something together is a good feeling. Unfortunately, you will probably run into the exceptions: people who, for one reason or another, are just plain mean. Demeaning bosses. Lying colleagues. Stupid, ignorant customers. Don't take this too hard. Try to work around them and do what you can to minimize the pain and effort they cause, but don't blame yourself. As long as you stay honest and do your best, you've done your part.
Tuesday, March 13, 2007
The death of computing science discipline
Neil McBride says computer science was populated by mathematicians and physicists, but now virtual robots can be created by eight-year-olds without needing programming, logic or discrete mathematics skills. Does that mean we have a dying discipline?
We all know there's a crisis in university computer science departments. Student numbers are dwindling - down 115 just last year.
At the same time the computing unit of funding has fallen. And the onset of fees has made students think twice about joining a profession where the plethora of new jobs in the 1990s has reduced to a trickle and it's only just looking as if employment prospects may be on the upturn.
Dropping numbers of A Level students, a view that IT is a job for geeks and social misfits, and a perception that there's nothing interesting in computer science don't help. Even the value of the research base is being questioned.
And the problem's global. In the US, the number of students choosing computer science dropped by 39 per cent between 2000 and 2005. In Australia, cuts in IT academic staff are the order of the day.
In such dire circumstances, it's tempting to hanker after the glory days when computer science ruled, departments were full, and students flocked to a leading edge discipline where the ideas were fresh.
It's easy to be nostalgic about the days when the income from computer science subsidised other departments and computing was the Prince of the university not the Cinderella.
We long for the days when assembler programming ruled, when programming was exciting and leading edge, when distributed computers were being created and there were uncharted vistas of applications to be written, and single applications such as ledgers and transaction systems transformed businesses. But that is the past. Today the ship is holed below the waterline.
As the ship sinks, we computer scientists fiddle on the deck hoping to avoid the icy waters. We claim, as the President of the BCS has recently, that there is still a massive need for computing students in the UK today.
We look to games programming for our salvation, designing games programming courses and reducing a wide-ranging industrial discipline to a set of geeks programming computers to zap spacecraft and dismember aliens.
It's a sorry sight to see computing academics fighting for the last few lifeboats. But the heyday of massive liners, full of programmers, plying the commercial sea-lanes is over. There may be room for a few luxury liners, but most of us fly on budget airlines.
It's easy to think that the problem is that people (read potential students) just don't understand how exciting computing is and that this can be fixed by a bit of sharp marketing, slick videos and some school visits. But the students are not that gullible. The real nature of the problem lies at the roots of the discipline.
Something significant has changed. There is the smell of death in the air.
In the early days, computer science was populated by mathematicians and physicists excited at the prospect of vastly accelerated computation. New languages were developed. FORTRAN, Algol, COBOL, and PL/1 took root. The foundations of programming were laid.
There was excitement at making the computer do anything at all. Manipulating the code of information technology was the realm of experts: the complexities of hardware, the construction of compilers and the logic of programming were the basis of university degrees.
The power of hardware has increased, as IBM 370s in air-conditioned warehouses gave way to computers in the home and advanced robots became this year's Christmas toys.
However, the basics of programming have not changed. The elements of computing are the same as fifty years ago, however we dress them up as object-oriented computing or service-oriented architecture. What has changed is the need to know low-level programming - or any programming at all. Who needs C when there's Ruby on Rails?
Now vastly complex applications for businesses, for science and for leisure can be developed using sophisticated high-level tools and components.
Virtual robots - Zooks - can be created by eight-year-olds without needing programming, logic or discrete mathematics skills. Web designers build complex business sites, graphic designers build animations, and accountants assemble business systems without needing to go object-oriented.
Computer science has lost its mystique. There is no longer a need for a vast army of computer scientists. The applications, games and databases that students once built laboriously in final year projects are bought at bookshops and newsagents.
If the gap between public knowledge and academic curriculum isn't large enough, the gap between academia and industry practice is a gaping hole. While academic departments concentrate on developing new computer systems in an ideal organisational environment, a lot of industry has moved away from in-house development to a focus on delivering a service.
As commercial software products have matured, it no longer makes sense for organisations to develop software from scratch.
Accounting packages, enterprise resource packages, customer relationship management systems are the order of the day: stable, well-proven and easily available. IT departments now focus on contracts, tenders, service level agreements, training, system usage and incident management.
Interrupts, loops, algorithms, formal methods are not on the agenda. IT is about deploying resources to meet the information needs of its customers.
Implementation, facility management, systems integration, service management, organisational change, even environmental audit - these are the language of IT. They hardly feature on computer science courses.
The environment within which computing operates in the 21st century is dramatically different to that of the 60s, 70s, 80s and even early 90s. Computers are an accepted part of the furniture of life, ubiquitous and commoditised.
Like cars, a limited number of people are interested in their construction, more live by supporting and maintaining them; most of us accept them as a black box, whose workings are of no interest but which confer status, freedom and convenience.
Indeed, whereas building a new car needs mechanical know-how, building a new computer application can be done by the user who has no grounding in computer science.
Computing is also affected by globalisation. The loss of jobs in IT and declining computer science enrolments are a global problem for developed countries. Since the software product can be transmitted almost instantaneously, why develop it in expensive facilities in the West?
Armies of highly trained computer scientists are available in India, Sri Lanka and China. The expertise is easily off-shored. In India, over 100,000 new IT graduates a year are ready to support an off-shored IT industry.
Companies like Microsoft, Hewlett-Packard and Siemens have well established software development operations in India. Why are we not co-operating more with the Indian IT industry?
So where does that leave computing departments in universities? Do we pull up the drawbridge of the castle of computational purity and adopt a siege mentality: a band of brothers fighting to the last man? Or do we recognise that the discipline is dying - if not actually dead, then breathing shallowly?
The old man has run his race well. He has changed the nature of human existence. Do we let go gently, realising that the discipline has run its course? The actual problem is one of the perception of the proponents of our discipline, not the potential students.
The old generation needs to look to a new generation, to new approaches. The focus is moving away from system construction. The jobs are in the application of technology. There is a need to be closer to the application, closer to the user, to replace a reductionist, convergent discipline with a complex, divergent discipline.
The complexity of embedded systems, of modern computing applications requires a different way of thinking. A reductionist, programming mindset does not adapt well to uncertainty, emergent behaviour, the unexpected and the study of the whole.
Relationships are important. The new computing discipline will really be an inter-discipline, connecting with other spheres, working with diverse scientific and artistic departments to create new ideas. Its strength and value will be in its relationships.
There is a need for innovation, for creativity, for divergent thinking which pulls in ideas from many sources and connects them in different ways.
The new computing department will be the department of interdisciplinary studies, drawing ideas from biology, design, history, medicine and contributing a rich computing foundation to those disciplines. It will be looking outwards rather than inwards, concerned to address the vast landscapes of computing application.
So how many computer science departments will exist in 30 years time? Perhaps a few will support the elite luxury liners. Most will have given way to interdisciplinary study departments, and computing service departments, producing innovative graduates who can corral and manage the IT resources organisations need.
Computer science curricula are old, stale and increasingly irrelevant. Curricula need to be vocational and divergent, widening the computing student's view of the world, not creating a sterile bubble closed off from the wider issues in the world, and from the networking, the integration, the global reach of computers.
There is a need for a drastic rethinking of what the discipline is about. There is a need for new curricula which represents a real paradigm shift, not just a move from keyboards to pen computing.
Here at De Montfort I run an ICT degree, which does not assume that programming is an essential skill. The degree focuses on delivering IT services in organisations, on taking a holistic view of computing in organisations, and on holistic thinking.
Perhaps this represents an early move towards a new kind of computing discipline. As the roots rot and the tree falls, a vast array of new saplings appears. Those saplings may be the start of a new inter-discipline: new computing for the 21st century.
Neil McBride is a principal lecturer in the School of Computing, De Montfort University.
Thursday, March 01, 2007
Introducing iPhone

Wednesday, February 21, 2007
Meet "The Artist" of the Google Logos

The Korea Herald: How long did you live in Korea as a child? What was it like?
Dennis Hwang: I was born in Knoxville, Tenn., but moved to Korea when I was about five years old. My hometown was Gwacheon, where I had a very normal childhood. I went through public schools like everyone else, spending six years at Gwacheon Elementary School and two years at Munwon Middle School. Actually, many of my ideas and much of my style stem from the time I spent during my childhood in Korea. Whatever challenges the logos bring, I can often rely on the little doodles that I used to do in school when I was young. Something that used to be frowned upon turned out to be my greatest asset.
Herald: When did you move back to the United States?
Hwang: I came back in 1992 when my father received a Fulbright Scholarship to research in the United States.
Herald: What was it like going to an American school all of a sudden?
Hwang: I was placed in a public middle school but was completely unprepared for it. I didn't speak a word of English. For the first six months, I couldn't communicate with the teachers or students. With the help of ESL programs though, I got better. My father returned to Korea, but my brother and I decided to continue our education in the States. My parents have made unimaginable sacrifices for us over the last 10 years, and I wouldn't be this successful without their support.
Herald: What was the first logo you designed for Google?
Hwang: Google had been using outside contractors to do the earlier logos, so the first project I got was modifying the Fourth of July logo in 2000. The two founders, Larry Page and Sergey Brin, wanted something more fun, so I redrew parts of the image. The next logo was for Bastille Day, which is the first logo I did from scratch.
Herald: Which letters are your favorite targets for manipulation?
Hwang: Understandably, the "O" and the "L" are the easiest to deal with. The "O" has become a Halloween pumpkin, a Nobel Prize medal, the Korean flag symbol and the planet earth. The "L" has been used as a flagpole, the Olympic flame cauldron or a snow ski. The first "G" is the most difficult to deal with, and I don't think the "E" has gotten much action because of its location.
Herald: How did you come to do the Korean Independence Day logo?
Hwang: Google makes a big effort to recognize holidays that aren't necessarily mainstream. The Korean Independence Day logo was seen globally by tens of millions of people. Numerous Korean-Americans wrote to thank us on Aug. 15 last year. Many expressed how proud it made them to see the Korean flag.
Herald: Do you have plans for other Korea-related logos in the future?
Hwang: I'll definitely do a special logo for Korea hosting the World Cup.
Herald: You're only 23 years old. What are your future plans?
Hwang: Who knows? It's very important to me that I can work both technically and artistically. Google is a perfect place to do that. It allows me to have a programming job while letting me express myself artistically, with the added bonus of having my work be seen by tens of millions of people in a single day.
Herald: What is your favorite letter among the ones found in the word "Google?"
Tuesday, February 20, 2007
Windows VISTA - Microsoft's Biggest MISTAKE
Monday, February 19, 2007
64 DVDs on a disc: holographic storage to ship
The initial Tapestry HDS-300R holographic disc drives shipped in December to beta users.
The Tapestry HDS-300R, which is the first rev of the product, will use a write-once format suited to regulatory agencies and is aimed strictly at the archival market for industries such as IT, health sciences, government agencies and professional video recording. (Its 300GB capacity is roughly that of 64 single-layer 4.7GB DVDs, hence the headline.) InPhase plans to come to market with a re-writable format disc by the end of 2008.
"We're not going to play in the backup market at all," said Nelson Diaz, CEO of InPhase, in Longmont, Colo.
InPhase's first-generation product has a data transfer rate of 20MB/sec., a mean time between failures of 100,000 hours, and a 50-year expected lifespan. By the end of 2008, InPhase plans a second-generation 800GB optical disc with data transfer rates of about 80MB/sec., with plans to expand capacity to 1.6TB by 2010. Diaz said his company plans to make all of its products backward compatible.
Diaz also said InPhase announced today that it has signed a partnership with jukebox manufacturer DSM Handhabungssysteme GmbH & Co. KG in Germany to adapt its holographic drives to a library system. Diaz said DSM plans to have a holographic library out in early 2008 with up to 675TB of capacity. InPhase is also working with a number of other tape library manufacturers to adapt its technology.
Each holographic disc actually holds up to 600GB; Diaz said the remaining 300GB, not used for data storage, is taken up by error-correction data. "We have lots and lots of data redundancy. Because we're going after the archive market we've really erred on the side of caution in terms of data recovery," said Liz Murphy, vice president of marketing at InPhase.
To a backup server the holographic disc looks like a drive letter, allowing users to drag and drop files, Murphy said.
"We've also tried to make as easy to integrate as possible from a software perspective. So it can emulate a DVD, CD-R, magnetic optical disc or tape drive. So software companies don't have to do any major changes to write to it in native mode," Murphy said.
The optical platters are encased in a 5.25-in. square casing that looks like a floppy disk, except that it is 3 millimeters thick. The platter itself is 1.5 millimeters thick, and data is written as a holographic image throughout the substrate of the disc.
Unlike CDs and DVDs where data is written on the surface, data is written throughout the substrate of the disc, meaning scratches, dust or dirt have little effect on data retrieval, Diaz said.
At $18,000 for a holographic disc drive, InPhase has priced its product roughly midway between a $30,000 enterprise-class tape drive and midrange tape drives such as LTO drives, which go for around $4,000. The holographic platters will retail for $180 each. InPhase was spun off from Bell Laboratories in 2000 to commercialize the technology. The company plans to sell the product through resellers. Hitachi Maxell Ltd. will manufacture the Tapestry line of holographic discs with photopolymer materials from Bayer MaterialScience AG.
Diaz said he already has orders for the product from big-name organizations such as Turner Broadcasting System Inc., the U.S. Geological Survey and Lockheed Martin Corp.
Sunday, February 18, 2007
The Mars Mission Continues
Q: What exactly is Java's role in the Mars Rover mission?
A: The places where NASA scientists have used Java for this mission are all on the ground side right now. They have created this collaborative command and control system called Maestro, which does this combination of data visualization, collaboration, and command and control. It lets them look at images and create 3-D reconstructions of terrain. It allows various experimenters to look at the scenes and topography, browse the image databases and take part in all the collaboration they need to do. And to do it in a remote, distributed and collaborative kind of way - so they could actually have scientists at institutions all over the world who are not only looking at the data, but also collaboratively deciding on the way the mission should proceed. One of the nice things that the JPL guys have done is that they've made a "cleaned up for civilians" version of this application available, called Maestro.
Q: How is Java assisting in controlling the Rover from Earth?
A: There's a Java API called Java Advanced Imaging that's used for the images captured by the panoramic camera - the one producing images with excruciating detail. Those panoramas are being created by combining images from two different cameras onboard the Rover, so with the two lenses they get two images - just like you've got two eyes - so you can do a stereoscopic mapping where your brain is able to figure out how far away things are. Because they've got these stereoscopic images, they can go through a process that's called stereo-image correlation, and calculate for each pixel in the image how far away that picture element really is.
With this information, the JPL scientists can calculate how far away each rock is, each picture element, for all of the millions of pixels in one of these large images. So you can get the depth of the image at every point. That's what a stereo camera gives you. When you've got the depth information, you can then actually build a 3-D model, the actual model of the terrain. And then you can actually map the image back onto the 3-D model. So then what you've got is a colored, three-dimensional model of the world around you.
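The geometry behind that calculation is compact enough to sketch. For a calibrated stereo pair, depth is inversely proportional to disparity: Z = f x B / d, where f is the focal length (in pixels), B is the baseline between the two lenses, and d is the disparity between matched pixels. Here is a minimal illustration in Java - not JPL's code, and the camera constants below are invented for the example:

    // Minimal sketch of depth-from-disparity, the idea behind stereo-image
    // correlation. Not JPL/Maestro code; the camera constants are made up.
    public class StereoDepthSketch {
        static final double FOCAL_LENGTH_PX = 1500.0; // hypothetical focal length, pixels
        static final double BASELINE_M = 0.30;        // hypothetical lens separation, metres

        // Depth Z = f * B / d for one matched pixel pair.
        static double depthMetres(double disparityPx) {
            if (disparityPx <= 0) {
                return Double.POSITIVE_INFINITY; // zero disparity: point at infinity
            }
            return FOCAL_LENGTH_PX * BASELINE_M / disparityPx;
        }

        public static void main(String[] args) {
            // A rock matched at column 812 in the left image and column 800
            // in the right image has a disparity of 12 pixels:
            System.out.printf("depth = %.1f m%n", depthMetres(812 - 800));
            // Repeating this for every matched pixel yields the per-pixel
            // depth map from which the 3-D terrain model is built.
        }
    }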
Q: Are they actually commanding the Mars Rover with Java?
A: For the command and control system, big parts of it are this rather large Java application. There are a lot of parts involved in this. The Rover itself has a computer onboard. There's no Java in that computer now. But on the ground-side, there are a number of parts of the whole command and control chain that goes out to the Rover that's done in Java. It's not like every last piece of every subsystem is based on the Java code. Great big pieces of it are. In particular, all the data visualization, user interface front-end stuff and I believe a whole lot of the database stuff is.
Q: How does the public version of the Maestro application work?
A: If you go to the Maestro website you will find that they've got two sets of downloads. One is the Maestro application itself, and the other is a first teaser set of data from Mars. There are different versions for different platforms. There's one for Solaris, a version for Linux, there's a version for Windows and more. The fact that they've got all those versions just shows you how portable Java is, how cross-platform it is. It's exactly the same program in all of those, they are just packaged differently.
So when you download the first set of data, there's a script that walks you through looking at some of it. Using the 3-D model they have there, and using your mouse, you can actually manipulate the 3-D model and get a view as if you're standing off to the side of the landing site looking back at it. You can actually wander around the landing site. You can see the rocks. You can see one of the places where one of the air bags didn't deflate completely. All of this 3-D, walk-through visualization uses standard Java APIs like the Java 3-D API, the Java Advanced Imaging API, the Java networking APIs and the user interface APIs.
Using the Maestro version they are distributing, not only do you see a 3-D model of the terrain, you see a 3-D model of the Rover. You can actually drive the Rover on this simulated terrain. It has this "video game" aspect to it. Except that you're actually driving it on a terrain model that's real. It's real Mars data that's constructing this terrain. It's not like playing Dune, where you're going through this maze that's completely fictitious.
Q: How has it been to work with the JPL scientists on this project?
A: I've spent a good amount of time down there with JPL, not only interacting with some of them, but I'm also on one of their advisory boards. In terms of talent density, IQ points per square meter, it's just an amazing place. Plus, they are doing things that most people would think of as science fiction. Most people read science fiction stories about driving dune buggies on Mars. These guys actually build them. They actually know how to fly between the planets. I've spoken to some of the guys that do interplanetary navigation, and that is really spooky stuff. You actually have to pay attention to relativity, the fact that time is not a constant - the faster you go, the slower things are. They function in a world where relativity actually matters. They are way outside of Newtonian mechanics.
JPL is a place to go to have your mind blown - partly, because they are really charged up about what they do. This is a crowd of people who are living a dream. What they are doing is so out there, so wonderful. They are doing something that is very heroic, noble, exploratory and exciting. They are the only part of the U.S. government that I can really get excited about. NASA has this incredible public outreach program, because they know that they are loved and it's a tremendous public service. They do lots of stuff with schools. The fact that they put this stripped-down version of Maestro out there is a wonderful piece of public outreach.
Q: What is it about Java that makes it so attractive for this type of application?
A: The answer is there's a bunch of things, not just one thing. One of the aspects of Java that was really important to them is that it runs on a lot of platforms. If you look at JPL, they've got Solaris boxes, Linux boxes, Windows boxes, Apple boxes - it works on all of them. If you look at the standard API libraries available from Sun, there's a huge toolkit of things that you can bring to bear. There are things like the 3-D modeling APIs, the Advanced Imaging APIs, and all the user interface APIs and networking APIs. The JPL guys used all of them. They were able to leverage all of these standard tools.
Plus, there's been a lot of experience with Java where folks have measured things like developer productivity. For example, if you compare how long it takes someone to develop a piece of software in Java versus C or C++, essentially all of the measurements show Java to be about twice as effective. So if it takes a team of engineers ten months to do something in C++, it will take them five months to do it in Java. For an outfit like JPL that does everything on a shoestring budget, that means you can trim five months off the schedule - or still take ten months, but with half as many people.
There are also a number of aspects of Java that are all about building more reliable applications. It is a lot easier to build things that are more reliable - that break less often - so you don't have to worry about blowing your machine to bits. There's also a lot in Java that's all about security. So when you've got something like their large databases - whose integrity is something they need to be careful about - security is important. So, it's a whole bunch of things that all swirl together. Whenever you talk to Java developers, you get a different answer to the "why did you do it in Java" question, although there tends to be a standard set of themes.
Q: Is this the first time that Java has been used for this type of application - being beamed out into space?
A: Actually, I don't think so. I've talked to people that have been doing various things with satellite ground station control and people that control systems for giant telescopes. People are doing stuff like this all over the place. The JPL project is what people tend to talk about because it's too damn cool for words.
Java Technology and the Mission to Mars
By Dr. James Gosling
CTO, Developer Programs
Sun Microsystems
Monday Jan 5, 4:00 PM PT
This weekend I saw what has to be the coolest Java(TM) app ever. We talk about "Java everywhere," which usually means Java(TM) technology in vehicle navigation, imaging, control consoles and things like that. This time "everywhere" means all of that and then some, because Java technology is playing a big role in NASA's latest, highly successful, Rover Mission to Mars.
On Saturday night I watched as a control room full of immensely tense geeks exploded with joy as they successfully landed the six-wheeled Mars Rover "Spirit" on the planet's surface. From there, scientists at NASA's Jet Propulsion Laboratories in Pasadena will get to use their powerful, Java technology-based, ground-side control system to maneuver Spirit on the Martian terrain in what has to be the most amazing network "video game" in history.
Java 3D and Java Advanced Imaging technology are also key to the software JPL is using to render and interpret realtime images captured by the Rover. NASA has even made a stripped down version of the software that you can download so you can view a simulated 3D landscape and drive the Rover around in it.
There are all kinds of reasons JPL is using Java technology for control and imaging systems for the Rovers. NASA engineers had access to hundreds of specialized APIs and network protocols that they needed to bring this off. They got great productivity and reliability. The data they're collecting through this program is part of a distributed, collaborative network of scientists and engineers, and the ability of Java technology to run on any platform enables a secure, reliable global dialogue within NASA's scientific community.
Now that Java has helped us get to Mars, who knows what "Java Everywhere" will mean in the future?

Sean O'Keefe, administrator of the National Aeronautics and Space Administration, celebrates after the staff in the Mars Mission Control Room receives a signal that the Mars Rover has landed safely on Mars.

Mars mission controllers, Stan Thompson, foreground, and Bill Currie, prepare for the long evening ahead in the Mars Mission Control Room.
Wednesday, February 14, 2007
Sunday, February 11, 2007
Saturday, February 10, 2007
Beginner's All-purpose Symbolic Instruction Code
Prior to the mid-1960s, computers were extremely expensive tools used only for special-purpose tasks. A simple batch processing arrangement ran only a single "job" at a time, one after another. During the 1960s, however, faster and more affordable computers became available. With this extra processing power, computers would sometimes sit idle, without jobs to run.
Programming languages in the batch-programming era tended to be designed, like the machines on which they ran, for specific purposes (such as scientific formula calculations, business data processing or, eventually, text editing). Since even the newer, less expensive machines were still major investments, there was a strong tendency to consider efficiency the most important feature of a language. In general, these specialized languages were difficult to use and had widely disparate syntax.
As prices decreased, the possibility of sharing computer access began to move from research labs to commercial use. Newer computer systems supported time-sharing, a system which allows multiple users or processes to use the CPU and memory. In such a system the operating system alternates between running processes, giving each one running time on the CPU before switching to another. The machines had become fast enough that most users could feel they had the machine all to themselves. In theory, timesharing reduced the cost of computing tremendously, as a single machine could be shared among (up to) hundreds of users.
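The core scheduling idea behind time-sharing is easy to sketch: give each ready job a fixed quantum of processor time, then move on to the next, cycling until everything finishes. Here is a toy round-robin loop in Java - an illustration of the concept only, with made-up job names, not a model of any historical system:

    import java.util.ArrayDeque;
    import java.util.Queue;

    // Toy round-robin scheduler illustrating time-sharing: each job runs for
    // a fixed quantum, then goes to the back of the queue until it finishes.
    public class TimeSharingSketch {
        public static void main(String[] args) {
            String[] names = {"payroll", "editor", "compile"}; // invented jobs
            Queue<int[]> ready = new ArrayDeque<>();           // {job id, time units left}
            ready.add(new int[]{0, 5});
            ready.add(new int[]{1, 3});
            ready.add(new int[]{2, 4});
            final int quantum = 2; // time units each job gets per turn

            while (!ready.isEmpty()) {
                int[] job = ready.poll();
                int ran = Math.min(quantum, job[1]);
                System.out.printf("%s runs for %d unit(s)%n", names[job[0]], ran);
                job[1] -= ran;
                if (job[1] > 0) {
                    ready.add(job); // not done: back of the queue
                } else {
                    System.out.printf("%s finished%n", names[job[0]]);
                }
            }
        }
    }

With a short enough quantum and a fast enough machine, each user perceives the whole computer as their own - exactly the illusion described above.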
The original BASIC language was designed in 1963 by John Kemeny and Thomas Kurtz and implemented by a team of Dartmouth students under their direction. BASIC was designed to allow students to write programs for the Dartmouth Time-Sharing System. It was intended to address the complexity issues of older languages with a new language designed specifically for the new class of users that time-sharing systems allowed — that is, less technical users who did not have the mathematical background of the more traditional users and were not interested in acquiring it. Being able to use a computer to support teaching and research was quite attractive enough. In the following years, as other dialects of BASIC appeared, Kemeny and Kurtz's original dialect became known as Dartmouth BASIC.
The eight design principles of BASIC were:
1. Be easy for beginners to use.
2. Be a general-purpose programming language.
3. Allow advanced features to be added for experts (while keeping the language simple for beginners).
4. Be interactive.
5. Provide clear and friendly error messages.
6. Respond quickly for small programs.
7. Not require an understanding of computer hardware.
8. Shield the user from the operating system.
The language was based partly on FORTRAN II and partly on ALGOL 60, with additions to make it suitable for time-sharing. It had been preceded by other teaching-language experiments at Dartmouth, such as DARSIMCO (1956), DOPE (1962), and DART (1963), a simplified FORTRAN II. Initially, BASIC concentrated on supporting straightforward mathematical work, with matrix arithmetic supported from its initial implementation as a batch language, and full string functionality added by 1965. BASIC was first implemented on the GE-265 mainframe, which supported multiple terminals. Contrary to popular belief, it was a compiled language at the time of its introduction. It was also quite efficient, beating FORTRAN II and ALGOL 60 implementations on the 265 at several fairly computationally intensive programming problems, such as numerical integration by Simpson's Rule.
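As an illustration of that early matrix support, whole-array operations could be written as single MAT statements. The sketch below is in the style of early Dartmouth BASIC (the exact dialect details are an assumption; MAT READ fills each matrix row by row from the DATA lines):
5 REM Add two 2x2 matrices and print the result
10 DIM A(2,2), B(2,2), C(2,2)
20 MAT READ A
30 MAT READ B
40 MAT C = A + B
50 MAT PRINT C
60 DATA 1, 2, 3, 4
70 DATA 5, 6, 7, 8
80 END
The same program in a scalar-only dialect would need nested FOR loops over every element.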
The designers of the language decided to make the compiler available without charge so that the language would become widespread. They also made it available to high schools in the Dartmouth area and put a considerable amount of effort into promoting the language. As a result, knowledge of BASIC became relatively widespread (for a computer language) and BASIC was implemented by a number of manufacturers, becoming fairly popular on newer minicomputers like the DEC PDP series and the Data General Nova. The BASIC language was also central to the HP Time-Shared BASIC system in the late 1960s and early 1970s. In these instances the language tended to be implemented as an interpreter, instead of (or in addition to) a compiler.
Several years after its release, highly respected computer professionals, notably Edsger W. Dijkstra, expressed the opinion that the use of GOTO statements, which existed in many languages including BASIC, promoted poor programming practices.[2] Some have also derided BASIC as too slow (most interpreted versions are slower than equivalent compiled versions) or too simple (many versions, especially those for small computers, left out important features and capabilities).
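To see the objection concretely, compare two fragments that both print the numbers 1 to 10; the syntax is only a sketch in the style of the classic line-numbered dialects. In the first, the loop exists only implicitly in the conditional jump; in the second, the FOR/NEXT pair states the structure directly:
10 REM GOTO style: the loop is hidden in the jump at line 50
20 I = 1
30 PRINT I
40 I = I + 1
50 IF I <= 10 THEN GOTO 30

10 REM Structured style: the loop is explicit
20 FOR I = 1 TO 10
30 PRINT I
40 NEXT I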
Notwithstanding the language's use on several minicomputers, it was the introduction of the MITS Altair 8800 microcomputer in 1975 that provided BASIC a path to universality. Most programming languages required more memory (and/or disk space) than was available on the small computers most users could afford. With the slow memory access that tapes provided and the lack of suitable text editors, a language like BASIC, which could satisfy these constraints, was attractive. BASIC also had the advantage of being fairly well known to the young designers who took an interest in microcomputers; Kemeny and Kurtz's earlier proselytizing paid off in this respect. One of the first BASICs to appear for 8080 machines like the Altair was Tiny BASIC, a simple implementation originally written by Dr. Li-Chen Wang and then ported onto the Altair by Dennis Allison at the request of Bob Albrecht (who later founded Dr. Dobb's Journal). The Tiny BASIC design and the full source code were published in DDJ in 1976.
In 1975, MITS released Altair BASIC, developed by Bill Gates and Paul Allen as Micro-Soft; the first Altair version was co-written by Gates, Allen, and Monte Davidoff. Versions of Microsoft BASIC soon started appearing on other platforms under license, and millions of copies and variants were soon in use; it became one of the standard languages on the Apple II (based on the quite different 6502 MPU). By 1979, Microsoft was talking with several microcomputer vendors, including IBM, about licensing a BASIC interpreter for their computers. A version was included in the IBM PC's ROM, and PCs without floppy disks automatically booted into BASIC, just like many other small computers.
Newer companies attempted to follow the successes of MITS, IMSAI, North Star, and Apple, creating a home computer industry; meanwhile, BASIC became a standard feature of all but a very few home computers. Most came with a BASIC interpreter in ROM, avoiding the need for disk drives, which were either unavailable or too expensive. Soon there were many millions of machines running BASIC variants around the world, likely a far greater number than the users of all other languages put together.
There are more dialects of BASIC than there are of any other programming language. Most of the home computers of the 1980s had a ROM-resident BASIC interpreter.
The BBC published BBC BASIC, developed for them by Acorn Computers Ltd, incorporating many extra structuring keywords, as well as comprehensive and versatile direct access to the operating system. It also featured a fully integrated assembler. BBC BASIC was a very well-regarded dialect, and made the transition from the original BBC Micro computer to more than 30 other platforms.
During this growth period for BASIC, many magazines, such as Creative Computing, were published that included complete source listings for games, utilities, and other programs. Given BASIC's straightforward nature, it was considered a simple matter to type in the code from the magazine and execute the program. Different magazines featured programs for specific computers, though some BASIC programs were universal and could be typed into any machine running BASIC.
Many newer BASIC versions were created during this period. Microsoft sold several versions of BASIC for MS-DOS/PC-DOS, including BASICA, GW-BASIC (a BASICA-compatible version that did not need IBM's ROM), and QuickBASIC. Turbo Pascal publisher Borland published Turbo BASIC 1.0 in 1985 (successor versions are still being marketed by the original author under the name PowerBASIC).
These languages introduced many extensions to the original home computer BASIC, such as improved string manipulation and graphics support, access to the file system and additional data types. More important were the facilities for structured programming, including additional control structures and proper subroutines supporting local variables.
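For example, in a QuickBASIC-style dialect a named subroutine with its own local variable replaces the GOSUB plumbing of the older home-computer BASICs. This is a minimal sketch (the subroutine name and prompt are illustrative):
DECLARE SUB Greet (Name$)

INPUT "What is your name: ", N$
Greet N$
END

SUB Greet (Name$)
    ' Count is local to the subroutine, invisible to the main program
    DIM Count AS INTEGER
    FOR Count = 1 TO 3
        PRINT "Hello, "; Name$
    NEXT Count
END SUB
Because Count is local, two subroutines can each use a variable of that name without interfering, something the global-only home-computer dialects could not guarantee.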
However, by the latter half of the 1980s, newer computers were far more capable, with more resources. At the same time, computers had progressed from a hobbyist interest to tools used primarily for applications written by others, and programming became less important for most users. BASIC started to recede in importance, though numerous versions remained available. Compiled BASIC or CBASIC is still used in many IBM 4690 OS point-of-sale systems.
BASIC's fortunes reversed once again with the introduction of Visual Basic by Microsoft. It is somewhat difficult to consider this language to be BASIC, because of the major shift in its orientation towards an object-oriented and event-driven perspective. While this could be considered an evolution of the language, few of the distinctive features of early Dartmouth BASIC, such as line numbers and the INPUT keyword, remain.
Many BASIC dialects have also sprung up in the last few years, including Bywater BASIC and True BASIC (the direct successor to Dartmouth BASIC from a company controlled by Kurtz). Recently, the remaining community using Microsoft's pre-Visual Basic products has begun to switch wholesale to FreeBASIC, a GPLed compiler which has been in the process of moving BASIC onto a GCC backend. Many other BASIC variants and adaptations have been written by hobbyists, equipment developers, and others, as it is a relatively simple language to develop translators for. An example of an open source interpreter, written in C, is MiniBasic.
The ubiquity of BASIC interpreters on personal computers was such that textbooks once included simple TRY IT IN BASIC exercises that encouraged students to experiment with mathematical and computational concepts on classroom or home computers.
Sample Code
10 INPUT "What is your name: "; U$
20 PRINT "Hello "; U$
30 REM Ask for a count and print that many stars
40 INPUT "How many stars do you want: "; N
50 S$ = ""
60 FOR I = 1 TO N
70 S$ = S$ + "*"
80 NEXT I
90 PRINT S$
100 REM Offer another round; repeat while the answer starts with Y
110 INPUT "Do you want more stars? "; A$
120 IF LEN(A$) = 0 THEN GOTO 110
130 A$ = LEFT$(A$, 1)
140 IF (A$ = "Y") OR (A$ = "y") THEN GOTO 40
150 PRINT "Goodbye ";
160 FOR I = 1 TO 200
170 PRINT U$; " ";
180 NEXT I
190 PRINT
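Tracing the listing above, a session might look like the following (user input appears after each prompt). Answering anything that starts with Y or y loops back to line 40 for another row of stars; otherwise the farewell at lines 150 to 180 prints the name 200 times, elided here:
What is your name: Mike
Hello Mike
How many stars do you want: 7
*******
Do you want more stars? no
Goodbye Mike Mike Mike ...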
Sunday, February 04, 2007
'You've got to find what you love,' Jobs says
I am honored to be with you today at your commencement from one of the finest universities in the world. I never graduated from college. Truth be told, this is the closest I've ever gotten to a college graduation. Today I want to tell you three stories from my life. That's it. No big deal. Just three stories.
The first story is about connecting the dots.
I dropped out of Reed College after the first 6 months, but then stayed around as a drop-in for another 18 months or so before I really quit. So why did I drop out?
It started before I was born. My biological mother was a young, unwed college graduate student, and she decided to put me up for adoption. She felt very strongly that I should be adopted by college graduates, so everything was all set for me to be adopted at birth by a lawyer and his wife. Except that when I popped out they decided at the last minute that they really wanted a girl. So my parents, who were on a waiting list, got a call in the middle of the night asking: "We have an unexpected baby boy; do you want him?" They said: "Of course." My biological mother later found out that my mother had never graduated from college and that my father had never graduated from high school. She refused to sign the final adoption papers. She only relented a few months later when my parents promised that I would someday go to college.
And 17 years later I did go to college. But I naively chose a college that was almost as expensive as Stanford, and all of my working-class parents' savings were being spent on my college tuition. After six months, I couldn't see the value in it. I had no idea what I wanted to do with my life and no idea how college was going to help me figure it out. And here I was spending all of the money my parents had saved their entire life. So I decided to drop out and trust that it would all work out OK. It was pretty scary at the time, but looking back it was one of the best decisions I ever made. The minute I dropped out I could stop taking the required classes that didn't interest me, and begin dropping in on the ones that looked interesting.
It wasn't all romantic. I didn't have a dorm room, so I slept on the floor in friends' rooms, I returned Coke bottles for the 5¢ deposits to buy food with, and I would walk the 7 miles across town every Sunday night to get one good meal a week at the Hare Krishna temple. I loved it. And much of what I stumbled into by following my curiosity and intuition turned out to be priceless later on. Let me give you one example:
Reed College at that time offered perhaps the best calligraphy instruction in the country. Throughout the campus every poster, every label on every drawer, was beautifully hand calligraphed. Because I had dropped out and didn't have to take the normal classes, I decided to take a calligraphy class to learn how to do this. I learned about serif and sans serif typefaces, about varying the amount of space between different letter combinations, about what makes great typography great. It was beautiful, historical, artistically subtle in a way that science can't capture, and I found it fascinating.
None of this had even a hope of any practical application in my life. But ten years later, when we were designing the first Macintosh computer, it all came back to me. And we designed it all into the Mac. It was the first computer with beautiful typography. If I had never dropped in on that single course in college, the Mac would have never had multiple typefaces or proportionally spaced fonts. And since Windows just copied the Mac, it's likely that no personal computer would have them. If I had never dropped out, I would have never dropped in on this calligraphy class, and personal computers might not have the wonderful typography that they do. Of course it was impossible to connect the dots looking forward when I was in college. But it was very, very clear looking backwards ten years later.
Again, you can't connect the dots looking forward; you can only connect them looking backwards. So you have to trust that the dots will somehow connect in your future. You have to trust in something — your gut, destiny, life, karma, whatever. This approach has never let me down, and it has made all the difference in my life.
My second story is about love and loss.
I was lucky — I found what I loved to do early in life. Woz and I started Apple in my parents' garage when I was 20. We worked hard, and in 10 years Apple had grown from just the two of us in a garage into a $2 billion company with over 4000 employees. We had just released our finest creation — the Macintosh — a year earlier, and I had just turned 30. And then I got fired. How can you get fired from a company you started? Well, as Apple grew we hired someone who I thought was very talented to run the company with me, and for the first year or so things went well. But then our visions of the future began to diverge and eventually we had a falling out. When we did, our Board of Directors sided with him. So at 30 I was out. And very publicly out. What had been the focus of my entire adult life was gone, and it was devastating.
I really didn't know what to do for a few months. I felt that I had let the previous generation of entrepreneurs down - that I had dropped the baton as it was being passed to me. I met with David Packard and Bob Noyce and tried to apologize for screwing up so badly. I was a very public failure, and I even thought about running away from the valley. But something slowly began to dawn on me — I still loved what I did. The turn of events at Apple had not changed that one bit. I had been rejected, but I was still in love. And so I decided to start over.
I didn't see it then, but it turned out that getting fired from Apple was the best thing that could have ever happened to me. The heaviness of being successful was replaced by the lightness of being a beginner again, less sure about everything. It freed me to enter one of the most creative periods of my life.
During the next five years, I started a company named NeXT, another company named Pixar, and fell in love with an amazing woman who would become my wife. Pixar went on to create the world's first computer-animated feature film, Toy Story, and is now the most successful animation studio in the world. In a remarkable turn of events, Apple bought NeXT, I returned to Apple, and the technology we developed at NeXT is at the heart of Apple's current renaissance. And Laurene and I have a wonderful family together.
I'm pretty sure none of this would have happened if I hadn't been fired from Apple. It was awful tasting medicine, but I guess the patient needed it. Sometimes life hits you in the head with a brick. Don't lose faith. I'm convinced that the only thing that kept me going was that I loved what I did. You've got to find what you love. And that is as true for your work as it is for your lovers. Your work is going to fill a large part of your life, and the only way to be truly satisfied is to do what you believe is great work. And the only way to do great work is to love what you do. If you haven't found it yet, keep looking. Don't settle. As with all matters of the heart, you'll know when you find it. And, like any great relationship, it just gets better and better as the years roll on. So keep looking until you find it. Don't settle.
My third story is about death.
When I was 17, I read a quote that went something like: "If you live each day as if it was your last, someday you'll most certainly be right." It made an impression on me, and since then, for the past 33 years, I have looked in the mirror every morning and asked myself: "If today were the last day of my life, would I want to do what I am about to do today?" And whenever the answer has been "No" for too many days in a row, I know I need to change something.
Remembering that I'll be dead soon is the most important tool I've ever encountered to help me make the big choices in life. Because almost everything — all external expectations, all pride, all fear of embarrassment or failure - these things just fall away in the face of death, leaving only what is truly important. Remembering that you are going to die is the best way I know to avoid the trap of thinking you have something to lose. You are already naked. There is no reason not to follow your heart.
About a year ago I was diagnosed with cancer. I had a scan at 7:30 in the morning, and it clearly showed a tumor on my pancreas. I didn't even know what a pancreas was. The doctors told me this was almost certainly a type of cancer that is incurable, and that I should expect to live no longer than three to six months. My doctor advised me to go home and get my affairs in order, which is doctor's code for prepare to die. It means to try to tell your kids everything you thought you'd have the next 10 years to tell them in just a few months. It means to make sure everything is buttoned up so that it will be as easy as possible for your family. It means to say your goodbyes.
I lived with that diagnosis all day. Later that evening I had a biopsy, where they stuck an endoscope down my throat, through my stomach and into my intestines, put a needle into my pancreas and got a few cells from the tumor. I was sedated, but my wife, who was there, told me that when they viewed the cells under a microscope the doctors started crying because it turned out to be a very rare form of pancreatic cancer that is curable with surgery. I had the surgery and I'm fine now.
This was the closest I've been to facing death, and I hope it's the closest I get for a few more decades. Having lived through it, I can now say this to you with a bit more certainty than when death was a useful but purely intellectual concept:
No one wants to die. Even people who want to go to heaven don't want to die to get there. And yet death is the destination we all share. No one has ever escaped it. And that is as it should be, because Death is very likely the single best invention of Life. It is Life's change agent. It clears out the old to make way for the new. Right now the new is you, but someday not too long from now, you will gradually become the old and be cleared away. Sorry to be so dramatic, but it is quite true.
Your time is limited, so don't waste it living someone else's life. Don't be trapped by dogma — which is living with the results of other people's thinking. Don't let the noise of others' opinions drown out your own inner voice. And most important, have the courage to follow your heart and intuition. They somehow already know what you truly want to become. Everything else is secondary.
When I was young, there was an amazing publication called The Whole Earth Catalog, which was one of the bibles of my generation. It was created by a fellow named Stewart Brand not far from here in Menlo Park, and he brought it to life with his poetic touch. This was in the late 1960s, before personal computers and desktop publishing, so it was all made with typewriters, scissors, and Polaroid cameras. It was sort of like Google in paperback form, 35 years before Google came along: it was idealistic, and overflowing with neat tools and great notions.
Stewart and his team put out several issues of The Whole Earth Catalog, and then when it had run its course, they put out a final issue. It was the mid-1970s, and I was your age. On the back cover of their final issue was a photograph of an early morning country road, the kind you might find yourself hitchhiking on if you were so adventurous. Beneath it were the words: "Stay Hungry. Stay Foolish." It was their farewell message as they signed off. Stay Hungry. Stay Foolish. And I have always wished that for myself. And now, as you graduate to begin anew, I wish that for you.
Stay Hungry. Stay Foolish.
Thank you all very much.
Taken from
http://news-service.stanford.edu/news/2005/june15/jobs-061505.html
without any modifications