Tuesday 22 December 2009

Testing email posting

This is my first email post, and I just thought I should say happy Christmas to any readers out there!

Saturday 12 December 2009

Portable technology I didn't know existed

I am typing this on a Nokia N770 tablet in my kitchen. The device is connected by Wifi to my home network, and the keyboard I am using is an iGo Stowaway I bought on eBay for 39 pounds sterling. You'll notice I didn't use a pound sign - that's a flaw in my current techno setup. But I'll save my digression into the world of keyboard drivers for another time.

This keyboard connects to the tablet via bluetooth, and provides me with a complete touch typing experience on a 4" handheld device.

Speedwise, the N770 has what would be considered a paltry 250MHz ARM926 processor. The thing that will amaze and shock all of you techno petrol-heads is that the experience of surfing the web and typing on this is comparable to doing the same on any other device I own with a keyboard. The experience is certainly superior in almost every way to typing on my Psion Series 5. It is a somewhat hunched experience, though, as I have to lean in close to what is quite a high-resolution screen for such a small size - proportionally, the pixels are smaller on this screen than on any computer monitor I own, which makes it harder to see what one is writing (a quick sum below shows why).
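
For a rough sense of just how dense that screen is - assuming the N770's standard 800x480 panel at roughly 4.1 inches diagonal - the back-of-envelope sum is:

\[
\text{PPI} = \frac{\sqrt{800^2 + 480^2}}{4.1} \approx \frac{933}{4.1} \approx 228,
\]

against something nearer 96 PPI for a typical desktop monitor, so each pixel here is well under half the linear size.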

I have found that adding a 1GB RS-MMC card to the tablet affords me the luxury of extending the device's memory: with swap enabled on the card, the 64MB of built-in RAM is effectively doubled to 128MB. This means more applications can be kept open, with the inactive ones swapped out to the memory card (a sketch of the mechanism follows below).
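
For the curious, here is a minimal sketch in C of the swap step. To be clear about the assumptions: the file path is hypothetical, the swap file must already have been created and prepared with mkswap, the program needs root, and Maemo normally handles all of this from its control panel - this just illustrates the underlying Linux call:

```c
#include <stdio.h>
#include <sys/swap.h>   /* swapon(2), Linux-specific */

int main(void)
{
    /* Hypothetical location of a pre-made swap file on the RS-MMC card,
     * e.g. created beforehand with:
     *   dd if=/dev/zero of=/media/mmc2/.swap bs=1M count=64
     *   mkswap /media/mmc2/.swap
     */
    const char *swapfile = "/media/mmc2/.swap";

    if (swapon(swapfile, 0) != 0) {   /* 0 = default priority flags */
        perror("swapon");
        return 1;
    }
    printf("swap enabled on %s\n", swapfile);
    return 0;
}
```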

Overall, I am able to do most of the things I can on any other computer, and I can fit both keyboard and tablet into a coat pocket with room to spare.

Next time, I will talk about my experiences with the Samsung ST1000: the GPS/Wifi/Bluetooth, touch-screen, gesture-driven, all-singing, all-dancing camera.

Saturday 21 November 2009

Some thoughts on the dual nature of the universe

For most of our waking hours, we look at the world around us and see nothing but the usual “this and that”. Ask someone if they truly understand the world around them and you will be met with strange looks. But do we really know anything about this world? Do we know enough to say that the principles governing its mechanics will truly be tied up in one neat formula?

The concept of duality – a paradigm particularly revered by the Egyptians – underpins our very existence to the extent that we cannot perceive it. It is a factor so deeply ingrained in the fabric of the universe that, were we actually granted an understanding of it, we would not be able to handle its true power. I do not profess to understand duality or its depth. My writings here are a simple mind piece to exercise the brain.

Duality is the natural need for all things to have an opposite. It could be argued that, without an opposite, some things would be imperceptible, and others would run amok. It could also be argued that, were some things not to have an opposite, our ability to perceive the world around us would have evolved differently to compensate for the absence. Can it be said that the world around us presents the epitome of perfectly balanced opposition?

Take, as an example, the simple duality of night and day. If night were not to exist, and we were to live only in daytime, our ability to perceive time and to carry out our sleep cycles would be governed by very different processes to those of today. All life would be affected: plants would suffer an overabundance of sunshine, and people would be forced to spend half the day away from the blazing heat. The global environment would probably be different too, where more sunshine would create more clouds, which in turn would block more sunshine, balancing out the temperatures somewhat. In this world governed completely by day, what if we were then told of the concept of night, and what it could mean for us, having never had an inkling of such a thing? Would we be better off, at this later stage of evolution, to have night thrust upon us? Would we find it as hard to perceive night as we would today to perceive a world with either night or day missing?

In order for some things to come into being, they must first reside in space either in a state of opposition to themselves or as a pre-formed consequence of opposite forces. Take the example of a book to be written by an author. For that book to come into being, the author must oppose the “force” willed upon him by the blank writing medium, by his own writer's block, and by the gaps in his creativity. I know it may seem strange to talk of a piece of paper as exerting a “force” that opposes a writer, but it is so. Opposition, in all of its guises, comes down to one thing – the need to expend energy and effort in turning a void into an act of creation. It is a simple fact that I cannot get from here into town without opposing the separation between me and the town. I cannot go out with a girl if there is hatred getting in the way. A snooker player cannot win a match unless he knows how to play snooker.

Anything that requires practice and hard work lies within the realm of opposing forces. War does not happen without opposing sides. Reproduction doesn't happen without a man and a woman.

An understanding of the force of opposition can be a very useful thing. It is a belief of mine that the Egyptians and other ancient cultures had an understanding of this force, and used it to build their temples and run their great civilisations. From a simple engineering point of view, there is no adequate explanation of how the Egyptians built the Valley Temple at the end of the Sphinx causeway – containing multi-hundred-ton blocks of limestone, hewn very accurately and hoisted dozens of feet into the air. Slave labour is not an adequate means of explaining this feat. Where are the pulley systems strong enough to lift the blocks? How did they organise the thousands of men required, and how did they order and attach the ropes? The same goes for the fallacy of using ramps to build the pyramids. To build the limestone and granite pyramids with ramps, up which the blocks were dragged, the ramps themselves would have had to be made of a material AT LEAST as strong as the blocks being dragged up them. The reality, however, is that such a ramp would have been several times bigger than the pyramid it served. For a pyramid of height ~800 feet, with a gentle incline of 15 degrees on the stone ramp, this would necessitate a ramp running back roughly 2,985 feet from the base (a quick check of the geometry follows below). So what – wouldn't they need a ramp to build the ramp? A sand ramp would be decimated by the stone blocks, whether carpeted on top with wooden beams or not. Yet somehow the Egyptians opposed this force hampering the creation of their legacy, and conquered it. How, I do not know. Since everything has an opposite, they evidently found it and used it to build their great structures.
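
As a quick sanity check on that figure, using the post's own assumptions (height h = 800 ft, slope θ = 15°):

\[
L_{\text{base}} = \frac{h}{\tan\theta} = \frac{800}{\tan 15^\circ} \approx 2986 \text{ ft},
\qquad
L_{\text{slope}} = \frac{h}{\sin\theta} = \frac{800}{\sin 15^\circ} \approx 3091 \text{ ft},
\]

so the ~2,985-foot figure is the horizontal run of the ramp; the dragging surface itself would be longer still.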

Saturday 26 September 2009

When parallel worlds collide

As ARM and Intel continue to collide in the mobile computing industry, a very significant step forward has been made by the ARM camp.

ARM recently announced the availability of Osprey, a hard macrocell of a dual-core Cortex-A9 processor customised for production on TSMC's 40G process. Furthermore, it is open for licensing today, and can be integrated into any System On Chip (SOC) a customer wishes to create.

This contrasts sharply with the approach taken by Intel. For production of an Atom SOC, Intel has decided to create an archaic two-chip package, whereby chip 1 (called Lincroft) houses the graphics, the CPU and an integrated memory controller, and chip 2 (called Langwell) houses the IO capability (disk controllers, and however else that is defined). Finally, if you want to build something with bluetooth, wifi or 3G connectivity, you need yet a third chip, called Evans Peak. The first thing to notice is that as portability goes up, so too does the number of functional packages (Lincroft, Langwell and Evans Peak will all be required to make a decent smartphone; only Lincroft and Langwell to make a larger MID device). In an industry centred on portability, this is absurd. The second thing to note is that a device manufacturer is not able to choose which graphics, or which CPU, they get from the Intel line-up - whatever Intel says, goes. The only differentiation on offer is, to a small degree, in the IO hub (Langwell).

However, the real difference Intel now faces from the ARM camp is that ARM has a high-performance macrocell of a dual-core CPU that outperforms any Atom in a smaller power envelope, and that individual companies can take and integrate as they please, with whatever IP they fancy, in whatever package they wish. The real trick up the customer's sleeve is that ARM has already done the floorplanning and place-and-route of the macrocell - all the customer has to do is drop that box into their system and concentrate their efforts on everything else, without having to worry about hitting frequency goals in the processor as well.

Intel's argument that the mobile industry is fragmented is the main one given for not allowing differentiation on the Atom platform. They effectively want to do to the mobile industry what was done to the PC industry, and create an innovation-stifling set of standards so that system integrators can start differentiating on..... oh, wait, they can't differentiate. A standard foisted on the community that is so restrictive will lead business to only one door - Intel's.

The mobile industry has operated under its own steam for years, and has been shaped by very different socio-economic factors to those that drove the standardisation of the PC industry. Fashion, entertainment and, most of all, extreme portability are the key factors driving it. And in parallel, standards that enable those things to flourish have emerged. Bluetooth addresses the problem of sharing information in situ with another person; Wifi addresses the need to connect to the internet in an unfamiliar setting where there may not be a cable that fits your device; 3G is the all-encompassing protocol that carries both voice and broadband through the towers of the carrier that supplied the device on a payment plan.

None of the aforementioned standards acts as a straitjacket on how a system can be built nowadays. In fact, companies seek to innovate in the way these standards are integrated into their platforms. Companies have sprung up that create IP for each of these standards in licensable form, with tools that allow system integrators to fold them seamlessly into their systems. A whole architecture of system integration components exists that becomes better optimised for portability with each passing generation, and introducing a standard integrated platform mastered by only one company would cripple the fantastic work undertaken in these areas by people across the industry whose numbers far exceed those employed by any one company.

So it is two very different worlds colliding. One takes a hands-off approach, leading to explosive levels of innovation that can only be achieved by such an open model; the other is driven entirely by the mechanics of domination, aiming for a controlled monoculture where only those admitted to the party can prosper.

We've seen what happens in the latter case before, and it leads to some decidedly megalomaniacal business practices in order to hang on to a precarious position.

My opinion is that the latter world cannot continue to prosper. As ARM moves upwards into higher-performance computing envelopes, people will get sick and tired of the strangulation caused by Intel, and no amount of strongARM (pun intended) tactics will outweigh the power that open innovation models have in getting a foot in the door of new and exciting industries.

Tuesday 11 August 2009

TSMC full of ARM - what about Intel?

According to reports, Freescale, Qualcomm, TI and others are already placing orders for massive shipments of smartbook chips that will max out production lines at TSMC and UMC (the semiconductor foundry companies).

By November, the lines should be at 100% capacity.

If Intel starts sampling and putting Atom parts into production, won't the TSMC lines already be chock-full of ARM parts from myriad other customers? How will Intel's poor little Atom part get a look in?

Friday 31 July 2009

A bit about AMD's new fabs

Here's a really good article about AMD's new wafer fabs.

Some of the business behind the curtain is exposed, and the motivation behind AMD's move to become a fabless semiconductor company is discussed. The main motivation is their battle with Intel, and the future seems bright for Global Foundries as well.

I just wonder how long it will be before AMD becomes an ARM architectural licensee. And I will discuss the impact this would have in another post, because I'm busy right now.

Sunday 26 July 2009

There will be no PA Semi-designed ARM core in the Apple tablet....

....unless they are planning to release the tablet in 2012 (do read on for why)....

I've been reading the hype regarding Apple's tablet computer thingummy, and there seems to be a zeitgeist of opinion that Apple's latest gadget will contain an ARM-compatible CPU designed by the team they gained by acquiring PA Semi in April 2008.

This article claims that a device will appear at the beginning of 2010, and that during its development (over a four-year period) Apple had considered using the Atom processor, rejecting it on grounds of power consumption. To bridge this gap, it is claimed, Apple bought PA Semi at the end of April 2008, giving Apple an in-house capability for System On Chip (SOC) design, as well as allowing it to differentiate its technology from everyone else's.

At the moment, iPhone teardowns reveal exact component specs, allowing every piece of hardware (and, consequently, the processor) to be identified. This is how we all know that the iPhone 3GS contains an ARM Cortex-A8 - that, and the Apple job adverts specifying knowledge of the ARM NEON instruction set (the Cortex-A8 is currently the only ARM silicon with a NEON pipeline attached). By designing its own chips, Apple could keep the contents completely secret - something you cannot do when you buy an off-the-shelf component.
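
As an aside, for a flavour of what that NEON pipeline does, here is a minimal sketch using the standard arm_neon.h intrinsics - nothing Apple-specific, just a hypothetical four-wide float addition, and it must be compiled for an ARM target with NEON enabled:

```c
#include <arm_neon.h>
#include <stdio.h>

int main(void)
{
    float a[4]   = {1.0f, 2.0f, 3.0f, 4.0f};
    float b[4]   = {10.0f, 20.0f, 30.0f, 40.0f};
    float out[4];

    /* Load four floats into 128-bit NEON registers, add all four
     * lanes with a single SIMD instruction, and store the result. */
    float32x4_t va   = vld1q_f32(a);
    float32x4_t vb   = vld1q_f32(b);
    float32x4_t vsum = vaddq_f32(va, vb);
    vst1q_f32(out, vsum);

    printf("%.1f %.1f %.1f %.1f\n", out[0], out[1], out[2], out[3]);
    return 0;
}
```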

Right now, talk of this tablet computer has people speculating that it will contain PA Semi's first CPU design for Apple.

I think this is unlikely.

Apple acquired PA Semi in April 2008. It takes nearly 2.5 years to bring a new microprocessor - from initial spec, to releasable quality - into a form where you could conceivably manufacture it, having validated it enough.

Various stories have emerged over the last year or so speculating that Apple will be using the PA Semi team to design an ARM CPU. Who knows if this is true, or if any of the rumours about this tablet are true?

PA Semi's last exposure to ARM CPUs was when some of the team worked on the StrongARM processor at DEC. That was an ARMv4 architecture processor, a very different beast to today's ARMv7 processors. Time has moved on, and there is a whole new architecture to learn, with many new features. It would take an architect 6-8 months just to master the new architecture, during which time they would model some of the micro-architectural ideas floating in their heads. This could happen in parallel with some Register Transfer Level (RTL) trialling, but only in a behavioural sense. By the time PA Semi had done all of this, it brings us to the end of 2008, with nothing more than an understanding of the architecture and various models of the micro-architecture of the CPU.

Even with an established product design team working on concepts and prototypes for the actual look and feel of the hardware (remember, they'd been working on this for nearly 2.5 years, if the story is true), the silicon inside is a very different beast. Assuming work then began on the actual design of the CPU, it would still take nearly two years to get the core to a state where ARM would sign off on its architectural compliance (if, indeed, the rumours of an ARM-compatible CPU design are true). That takes us to the end of 2010 before Apple has bespoke, manufacturable silicon.

So no, the timescales are all wrong. And remember, the task above relates only to designing the CPU - it doesn't include the time taken to design a SOC around it. They may have access to various designs already, but if they are going to the trouble of building a bespoke CPU, they won't at the same time go and take some other SOC - they'll build a new one.

Apple's recent share purchase in Imagination Technologies, the graphics processor design house, indicates that they will be using Imagination's technology in their SOC; but without an in-house design for a CPU, where will the CPU come from?

To design a SOC - into which they would drop an ARM CPU - they would need an ARM CPU license. When they just purchase an off-the-shelf chip package - like the ones they get from Samsung - they do not need this, because Samsung are the ones who have taken out the ARM CPU license and manufactured a SOC with it.

I think the first task for the PA Semi team will likely be to create a SOC from a pre-existing set of licensed designs, so as to keep products flowing from the portfolio.

There are interesting times ahead, but I don't believe that Apple have a custom ARM CPU ready to go for an early 2010 release. If this even has a shred of truth to it, expect an early 2011 release for anything like that.


Thursday 23 July 2009

Reverse alchemy

There was an article on EDN about a venture capital firm that has published details of the many approaches it received in the past - and turned down - from companies that are now amazingly successful.

They have had their successes (Skype, Verisign, PA Semi, etc.), but they also had a few howlers, including turning down FedEx seven times and passing on Google when it was starting up (you'll get a kick out of the quote).

The link to the article is here, and the company's address is here.

Wednesday 22 July 2009

Roller-coaster - man skates rickety death-plunge

Check out this guy, who fabricated a pair of inline skates for the purpose of skating the track on an enormous wooden roller coaster.

When you take adrenaline junkies and mix them with technology, great things usually happen. They don't apply technology to satisfy a craving for intellectual discovery; they do it to test the edge of physical endurance - effectively a form of high physical intellect.

He said that had a nail been poking out slightly from any part of the track at his top speed (90kph), he'd have been toast.


Monday 13 July 2009

MS Office - who needs a native copy?

According to The Register, Microsoft's oft-forthcoming online version of Office will have operational components that allow iPhone users to "view and scroll through" Office web applications. It is not entirely clear whether this actually means documents, or whether the Redmond-based company is allowing the applications to actually run on the iPhone.

The bit that caught my eye, though, was the demonstration of a variety of platforms - including mobile devices - editing the same documents, where the results of the edits appear the same on each device (presumably after synchronisation).

Does this mean that - at long last - full Office applications are going to become available on ARM-based devices via some clever integration of cloud apps into the WinCE/WinMo browsers and OS?

All of this would be the logical step for Microsoft to address the criticism levelled against them regarding their limited OS support for ARM-based devices. If all of their future software becomes available through the cloud, that gives fair-game access to most of their apps from many different processor architectures, and even from within other OSes such as Linux.

Sunday 28 June 2009

On a musical genius

This doesn't really fit the ethos of my particular blog, but I have to say how sorry I am to hear of Michael Jackson's death. I am not a devoted fan, but I grew up in the heady years of MJ. I saw the stories people published about him, and never really gave a thought to which part of the mix - his genius or his madness - was the more important. He wasn't really on my radar much, but now that he has died - and taken his gift with him - I've thought about this dilemma a bit more.

Undoubtedly, the musical legacy he left behind will live on forever. When an artist is alive, it is easy to overlook exactly what they give to the world. His looks were odd, and sometimes his voice was a bit too high-pitched for my liking. Taken in isolation, these things made MJ seem a bit quirky, a bit mad. However, take a huge leap in the air and view his tapestry from above, and those images and sounds diminish. What you see overall is an incredible manifestation of talent.

From his concerts to his strings of number one hits, to his accompanying video masterpieces, his amazing dancing abilities and his enigmatic style, this was someone that we can all look upon as reaching the very height of mastery of his talent. I cannot look upon anyone from the modern musical age and say that they can match MJ on all of these aspects simultaneously.

And herein lies the real crux of what I am going to say. When a legend is truly living, and giving as much as they do, our appreciation of them seems to unfold as the reciprocal of their talent. We start to examine other aspects of their heady existence, and instead of rejoicing in what their talent gives us, we vilify what we find in their private lives. Never do we stop to wonder what impact we may be having on them, and whether, by treating them this way, we may be expediting their demise and the loss of their gift.

This truly happened to MJ. It is only upon looking back - now that he is gone - that we want to partake of his gifts further than we had done when he was alive. Look at the sales of his music now that he is gone! If only the press, and his life, had been more genuine and normal, maybe he'd now be a picture of health - maybe his gifts would once again have kept on giving for us all to enjoy?

Everyone enjoys watching genius at work, and it is this genius that attracts both good and bad into orbit around such people. The bad is an inevitable sideband to an otherwise worthwhile signal, and some people are more able than others to filter that sideband out and get on with producing the signal. It is open to question how affected MJ was by his wretched sideband, given the strength of signal he produced. I think he courted the conjurers of his alter ego, the man in the hyperbaric chamber. His love-hate relationship with this schizophrenic persona - half innocent child, half megastar - led to a dichotomy that even he himself could not resolve, and it gave birth to the monster of transformation that overtook his life.

It is at the crest of a new rebirth that MJ fell victim to his own innocence. At a time when he appeared to be getting back onto the stage, the sideband of greedy people swirling around him came to bring him down once more. Ten concerts turned into fifty, and he himself complained of this. Yet, in his nature, he went ahead with it to avoid disappointing his fans. I wonder if he was at peace with his decision to go back on the road before the magnitude of his commitment hit him.

MJ, I and countless hundreds of millions of fans will miss what you gave, and will be thankful that you no longer have to live in the shadow of all that ailed you.

Rest in peace.

Thursday 25 June 2009

What can Nokia build....

... with Intel that they can't already build with ARM? The only thing is a device that runs full Windows, and even then the argument for doing so seems doubtful. The smartphone space is doing just fine without Intel, and isn't suffering too badly under Microsoft. Eventually Windows Mobile will mature and grow more feature-rich, and - coming from a different code base to desktop Windows - there is no reason to think it won't prove a cleaner, more usable experience, with less bloat and more efficiency.

If there is any compelling reason why Nokia would ever use Intel to create a smartphone, I'm waiting to see what it is!

Wednesday 24 June 2009

Turnkey? Reference designs? Who said we want those?

I've been squirming as I watch Intel talk of how they think the mobile gadget future is going to look. They talk about reference designs and turnkey solutions. These are fine in certain markets, but they are edifices of the PC world that hark back to the beginning of computing and the formation of the Microsoft/Intel duopoly. They made standardising the nascent computing market simple - which, let's face it, needed to happen in order for the world to start learning how to take advantage of it - and allowed Microsoft's OS to become the incumbent very quickly indeed. After all, since every computer would look the same, it was easy to make software work on it.

I guess if we were in a world where there wasn't yet the concept of a portable computer (and I'm talking of the smartphone form factor), then we'd need someone to come along and enforce this. But there is already a huge market of portable device platform manufacturers out there, and each of them is happy to be free to be different from everyone else. It gives the entire portable device ecosystem the chance to breathe, and innovate.

Companies like Nokia aren't interested in doing business exclusively with companies like Intel. This was made very clear in their joint announcement. Why would Nokia ever want to go to a single-source supplier, at a price point it cannot control or negotiate, when it is perfectly happy playing ST Microelectronics and Texas Instruments off against each other on pricing in exchange for winning the order? Competition is healthy, and it allows Nokia - and countless other companies - the freedom to choose where they get their silicon. It keeps costs down, and encourages silicon suppliers to keep pushing the bleeding edge of system integration.


Tuesday 23 June 2009

Who's eating whom?

A lot is being made in the tech press of this deal between Nokia and Intel, and how it surely represents Intel eating into ARM's market share.

Let's put things in perspective here.

In Taipei (Computex 2009), we saw countless ARM-based chipsets being touted by companies like Qualcomm, NVIDIA, Freescale, Samsung, Texas Instruments and others, each boasting a sizeable number of Original Device Manufacturers (ODMs). These weren't slouches, either - Asus, Acer, Pegatron, Inventec and so on. Household names in many cases, and typical PC industry stalwarts. These are companies that have been manufacturing laptops forever, or are breaking into the game in a big way. And here they all are, one year on, allowing chipset vendors to display devices bearing their names. Sitting in those booths were missed design wins for Intel: factory capacity diverted into manufacturing ARM-based equipment, displacing Intel.

If you like, Taipei was the show that really splattered it on the wall for Intel. They finally saw that their dominance of the computing market is at last being challenged, and all at a time when the world is turning sour for them with one of the largest anti-trust judgements in history levied against them.

I found the following blog on ARM's website, which contains some pretty interesting dissection of how the current computer manufacturing business could change if choice were introduced into the marketplace for these ODMs in Taiwan.

But away from the ODMs and back to Intel, who continually claim that standardisation is what they'll bring to the mobile device market. They claim that if they standardise the platform on which mobile devices are built - and, by the way, become the only supplier of that platform - then OEMs won't have to work as hard at differentiating their devices. This is tantamount to the death of innovation in the mobile space. And if the argument rests purely on the fact that the peripherals in their SOC will always sit in the same part of the memory map, so that an OS will always know where to find them, it doesn't hold any water.

All OSes have layers of software that abstract the underlying hardware (a HAL, or hardware abstraction layer) so that they don't need to care where the graphics processor is, how much memory it has, and so on. Intel are dreaming with this argument, since the countless ARM silicon providers all have HALs for their SOC platforms on a number of OSes - it isn't something an ODM or OEM needs to care about. So on that front, Intel's argument may seem clear-cut to Intel - it merely requires that every other processor architecture and platform disappear, so that only theirs is left standing!
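
To make the HAL point concrete, here is a minimal sketch (the names are hypothetical, not any real OS's interface) of the usual trick: the silicon vendor fills in a table of addresses and function pointers for their particular SOC, and the OS above calls through the table without ever hard-coding where a peripheral lives:

```c
#include <stdint.h>
#include <stdio.h>

/* Hypothetical HAL interface: the OS calls through these pointers
 * and never hard-codes a peripheral address. */
typedef struct {
    const char *soc_name;
    uint32_t    uart_base;   /* filled in per SOC by the vendor's port */
    void      (*uart_putc)(uint32_t base, char c);
} hal_ops;

/* Vendor A's port: their UART sits at one address... */
static void vendor_a_putc(uint32_t base, char c)
{
    /* On real silicon this would be a volatile MMIO write, e.g.
     *   *(volatile uint32_t *)(uintptr_t)base = (uint32_t)c;
     * printed here so the sketch runs anywhere. */
    printf("[%#010x] %c\n", (unsigned)base, c);
}

/* ...Vendor B's port: same interface, completely different memory map. */
static void vendor_b_putc(uint32_t base, char c)
{
    printf("[%#010x] %c\n", (unsigned)base, c);
}

static const hal_ops soc_a = { "VendorA", 0x10100000u, vendor_a_putc };
static const hal_ops soc_b = { "VendorB", 0xE0001000u, vendor_b_putc };

/* The OS-side code is identical whichever SOC it boots on. */
static void os_print_banner(const hal_ops *hal)
{
    hal->uart_putc(hal->uart_base, '!');
}

int main(void)
{
    os_print_banner(&soc_a);
    os_print_banner(&soc_b);
    return 0;
}
```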

With the Atom cutting into sales of their higher-cost laptop processors - adding confusion to the market - and with ARM devices eating their market share come the end of this year, I predict a great fall in Intel's profitability at the end of 2010. ARM's business model is less volatile than Intel's, and the analysis suggests that - for now - it is Intel that is being eaten, both by itself and by ARM.

Intel and Nokia - big deal!

So, Nokia decided they needed a way to survive, and Intel came along just at the right time!

Intel are the master of inking meaningless deals. The release of information about Nokia forming a technology partnership with Intel is scant on substance, and from a business perspective I'm wondering who the winners and losers are here.

ARM doesn't really lose anything, because the deal is non-exclusive. Nokia still deals with ARM and its partners in exactly the same way it always has. Judging by Nokia's results last year, and the way it is slowly transforming itself from a pure handset maker into a service provider and software platforms company, teaming up with Intel will simply add items to the balance sheet. Nokia doesn't have to do a lot, since I suspect Intel will bear most of the development costs, and Intel gets its name on a Nokia device. Nokia will sell these devices and make a profit from them.

Intel also gains from this, since they'll have access to lucrative 3G IP necessary to connect their future Atom platforms to mobile telephony.

With the advent of the full internet on ARM (see any news search engine, or this blog, for information on Flash availability for smartphones in October this year), there is no compelling reason for Nokia to adopt Intel across the board. The last of the holes has been plugged as far as the internet on ARM goes, and the only compelling reason left for Nokia to work with Intel is in creating a netbook-class device. Even then, the cogs are in motion for many ARM-based smartbooks to appear on the market before year's end.

The obfuscatory language used around the Nokia/Intel announcement leads one to suspect that they are trying to muddy the waters enough to make people think it's all about smartphones, when in fact it's probably just another netbook announcement. Nokia netbook rumours have been floating around since netbooks were in their infancy.

Although this is a small victory for Intel, it is a paper tiger soon to shrivel.

Has Apple missed the cart? (no Flash on iPhone)

Adobe announced today that full Flash 10 will make its beta release in October on the ARM architecture. Support will be available up front for Windows Mobile, Google Android and Symbian.

Great! That covers the majority of smartphones and the upcoming ARM-based smartbooks. But what about the iPhone?

According to this, Apple claims that the iPhone is under-powered - from a processor perspective - to cope with Flash. At the same time, Adobe has said that Apple are working to their own schedule on Flash. (Hang with me here - I'm making a point.) A third observation is that NVIDIA's Tegra (which has an ARM11 processor very similar to the chipset used in the iPhone prior to the iPhone 3GS) makes a delightful rendition of Flash. So why can't the iPhone?

The processor - even in pre-3GS days - looks to be sufficient, even if NVIDIA have re-written chunks of the Flash player to run on their GPU. The pre-3GS iPhone chipset had a graphics processor too - couldn't Apple be bothered to do the same thing? Or maybe they simply came to market too early to have had access (under the Open Screen Project) in time to do anything about it.

But now that they have the iPhone 3GS, with its superscalar Cortex-A8 processor and GPU, surely this announcement of support on every ARM mobile platform EXCEPT the iPhone puts Apple at a disadvantage?

Were Apple caught with their pants down? I for one think they have a version of Flash being readied within their walls in Cupertino, and that it would make a good announcement for Steve Jobs the next time they update the iPhone OS.

Sunday 22 March 2009

Watch this space

I have been watching with interest the hubbub over Google's Street View here in the UK. Although people probably don't want to be caught walking out of a sex shop in west London at the minute the car-with-a-pole goes by, there is surely a devoted band of people out there trying to make as many appearances as they possibly can.

I noticed that another piece of Google technology - Google Earth - had been used for two very different reasons in the news this week. One was the story of how Google Earth had allowed observers to identify a large scale archaeological feature under the sea off the coast of England (read it here).

The other story was about how a man convicted of metal theft confessed to using Google Earth to identify buildings with large amounts of lead on their roofs (read it here).

As if this wasn't enough, you can now play a game on Google Street View - well, sort of. It has become fashionable to post sightings of Wally from Where's Wally on the web. You'll find a sighting here.