Sunday, October 30, 2011

Understanding Display "Resolution" (Retina display)

When I first heard the "Retina display" of the iPhone explained, my perplexed reaction was "what for?"  By the way, "Retina display" is just a marketing term for a pixel density greater than 300dpi.

Display area is probably the most expensive piece of real estate in the world after the Disney Store in New York City's Times Square.  When you have a large spreadsheet to analyze, every pixel is priceless.

Why would you want to waste 960x640 of screen real estate at 326dpi?  It's not about discerning (read: snobby) people being able to pick out individual dots when the display density is below 200dpi.

For paper printouts, yes, you would want 300dpi or better.  But for an electronic display which is refreshed at 30 times a second?  For an electronic display showing moving images or video?

My main beef with high resolution displays on cell phones is that the density is too high.  The density should be reduced to make the display more readable, with each pixel put to good use.  Notice I said "put to good use".  The iPhone's beautiful display is not put to good use by the applications we are seeing. See explanation.

The Samsung Galaxy Note has 1280x800 pixels at a density of 285dpi.  This works out to a size of 5.3 inches diagonal.  Now, 1280x800 matches or exceeds the full-size displays of 85% of the laptop computers shipped in the world today!  I could do some serious work on a phone with this many addressable and displayable pixels.  The current Galaxy Note fits comfortably into all of my pockets.  At 250dpi, the same pixel grid would give a screen size of 6".  I think that would be better for an all-purpose computer that can be with you all the time.  My eyes, which are more than half a century old each, are OK with such a display.
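The arithmetic above is easy to check: the diagonal in pixels (Pythagoras) divided by the density gives the diagonal in inches.  A minimal sketch (the function name is my own):

```python
import math

def diagonal_inches(width_px, height_px, dpi):
    """Diagonal screen size implied by a pixel grid at a given density."""
    diagonal_px = math.hypot(width_px, height_px)  # sqrt(w^2 + h^2)
    return diagonal_px / dpi

# Galaxy Note: 1280x800 at 285 dpi
print(round(diagonal_inches(1280, 800, 285), 1))  # → 5.3
# The same 1280x800 grid relaxed to 250 dpi
print(round(diagonal_inches(1280, 800, 250), 1))  # → 6.0
```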




Smartphone Battery Life

Why do the specifications of smartphones still quote the maximum number of talk hours?  This is usually a huge, fantastic number: half a month or more.  And there is no other indicator of battery life.

As we all know, we use the smartphone more as a computer than as a (voice) phone.  If I had wanted a very good voice phone, I would just get a $20 Nokia simple phone.

Smartphone battery life should be stated the way laptop battery life is.  And no laptop battery can last a day.  Similarly, if you use a smartphone as a full-time computer, don't expect it to last a day.

From experience, on the first day after you buy your smartphone, your usage will be abnormally high, and you will discover that the battery life isn't that "good".  As the days go by, your usage drops to a more realistic rate and you will find that the battery lasts about a day.


The iPhone 5

I predict that the iPhone 5 will have a much bigger screen, à la the Samsung Galaxy Note.

Friday, October 28, 2011

Understanding "Resolution" 101

Related: Retina display

The word "resolution" has been so misused that the original definition, if I remember it correctly, actually bears little connection to how it is used nowadays.

Let's start at the very beginning, a very good place to start.

The first step in the communication process (from the screen to the person) is "visual acquisition".  Your eye must be able to acquire the image.  The image must be big and bright enough to trigger photosensitive cells in your retina.  Ignoring brightness, and just considering a single dot to represent the image, there is a minimum size for the dot below which the human eye cannot pick it out.  But this depends on how far away you are from the dot, so a better parameter to quantify this is the angle subtended by the dot to your eye.
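To put a number on the subtended angle: the conventional rule of thumb is that normal (20/20) visual acuity resolves about one minute of arc.  A quick sketch (the 12-inch viewing distance is my own assumption for a phone, not a figure from this post):

```python
import math

def subtended_angle_arcmin(dot_size_in, distance_in):
    """Visual angle of a dot of a given size at a given distance, in arcminutes."""
    return math.degrees(2 * math.atan(dot_size_in / (2 * distance_in))) * 60

# One pixel of a 326 dpi display (1/326 inch) viewed from 12 inches:
print(round(subtended_angle_arcmin(1 / 326, 12), 2))  # → 0.88
```

At 0.88 arcminutes, a single pixel is already below the roughly one-arcminute acuity limit, which is the point being made here.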

Figure 1 - Images of different sizes can subtend the same visual angle, depending on distance

Obviously the minimum subtended angle that can make a dot visible varies from person to person.  And an eagle probably beats every human.

The second step is image recognition.  To simplify, let's forget about pictures and restrict the discussion to the letters of the English alphabet.  The minimum number of dots needed to represent each letter of the alphabet and still keep every one distinguishable and recognizable (as letters of the English alphabet) is about 5 dots horizontal by 7 dots vertical.  If you have examined an old-fashioned CRT terminal, an old-fashioned moving stock display, or an old-fashioned dot-matrix printer, you would know what I mean.

Figure 2 - Anything less than a matrix of 5x7 would make it difficult to recognize all the letters of the English alphabet as we know them.

The minimum is 5x7.  But that doesn't mean you can't have more.  Having more doesn't make the letters any more distinguishable from one another.  Technically, anything more than 5x7 is pure waste.  But artistically, a bigger matrix for each letter allows you to have nice smooth curves, minute turns, and so on, to create a more visually appealing character.  Now you know why a typeface like Times Roman looks coarse and ugly on "low resolution" displays.  Serifs are just lots of curves, and curves require lots of pixels to achieve a smoothly graduated path.
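For the curious, here is one possible 5x7 rendering of the letter A (the exact dot pattern is my own illustration, not taken from any standard character generator):

```python
# Letter "A" in a 5x7 dot matrix: 1 = dot on, 0 = dot off.
A_5x7 = [
    "01110",
    "10001",
    "10001",
    "11111",
    "10001",
    "10001",
    "10001",
]

# Render the matrix as text, "#" for a lit dot and "." for an unlit one.
for row in A_5x7:
    print("".join("#" if dot == "1" else "." for dot in row))
```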

Figure 3 - Letter A drawn in a 50x50 dot matrix and in shades of color (dithering).

The above letter A, when seen from a distance where the subtended angle of each dot becomes barely discernible, looks like a smoothly drawn character.

Now that you know the reason, you should not say "Times Roman looks ugly on low resolution displays".  The right thing to say is probably: "Times Roman cannot be rendered adequately if each letter is represented by a small matrix of dots"!

"Resolution" is defined as the number of dots per unit distance.  A typical laser printer output is 300 or 600 dots per inch.  That is high resolution.  The LCD screen that I am using now has about 120 dots per inch.  Times Roman can still be rendered beautifully at a resolution of 10 dots per inch, provided you have a gigantic display.

Next, the displayable quantity (and size).  The monitor I am using now has 1,680 horizontal dots and 1,080 vertical dots, and it is 23 inches diagonal.  That determines how much of a spreadsheet I can see if each character in the spreadsheet is represented in a matrix of 10x14 dots.  Go ahead and do the simple division arithmetic to find out exactly how many rows and columns that is.  A person with a display of 1,024x768 would be able to see less of the same spreadsheet than I.  Even if that person has a 500-inch jumbo display at 1,024x768, he will still see less of the spreadsheet than I do on my 23-inch display.  For a display capable of showing 1,024x768, the person sees the same amount of information whether the display is physically 14 inches or 500 inches.  The only difference is that with a larger display, he can see the spreadsheet from further away.  Remember the very first point above about the angle subtended by each dot?  (Hint: if you fit the same number of dots onto a bigger screen, naturally each dot will be bigger.)
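The division arithmetic, sketched out (using the 10x14 character cell assumed above; the function name is my own):

```python
def text_grid(width_px, height_px, cell_w=10, cell_h=14):
    """Character columns and rows that fit on a display,
    given a character cell of cell_w x cell_h dots."""
    return width_px // cell_w, height_px // cell_h

print(text_grid(1680, 1080))  # → (168, 77)  my 23-inch monitor
print(text_grid(1024, 768))   # → (102, 54)  the jumbo 500-inch, same grid
```

The 500-inch display at 1,024x768 still gets only 102 columns by 54 rows, exactly as the paragraph above argues.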

By quantity, or the amount of information, I mean just that.  To illustrate: ten letters on a line is twice the amount of information of five letters on a line, and ten letters on a line on a 14-inch screen is the same amount of information as ten letters filling up a line on a 500-inch screen.  I hope you can grasp this in totality; otherwise please re-read.

I will digress a bit to talk about addressability.  All the previous discussion assumes you have a computer capable of generating that 1024x768 (or whatever) "resolution" video signal.  When you plug the cable into the display, the display's input hardware must be able to synchronize with that signal.  That is, it must be able to work out that you are streaming 1024x768 dots per frame.  If a display cannot synchronize with the input signal, the picture (on an analog display) will jump or appear as noise, or the display can even be damaged.  On a digital display, you will usually see a blank screen or an informational message telling you what's wrong.  If the video signal is received properly, the hardware will present that signal on the display screen.

It is entirely possible for a display to synchronize with a much higher frequency video signal, say 1920x1280, even if it can show only, say, 1024x768.  The electronics in the display would "greek" the signal, averaging a few dots of the input signal into one dot for the screen.  So a beautiful image would appear as a compressed, smudgy image on this "low resolution" screen.  Most projectors are like this.  The actual projection optics is expensive, and the most common one today is capable of only 1024x768 (up from 800x600 a few years ago).  However, most projectors today can accept input signals of any "resolution" so as not to inconvenience users.  How the signal is then projected varies: some projectors greek it; others show a 1024x768 viewport and let you pan to see the bigger picture.
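The "greeking" described above can be sketched as simple block averaging (a toy model with my own function name; real display scalers use fancier filtering, but the idea is the same: several input dots collapse into one output dot):

```python
def greek(frame, factor):
    """Downscale a frame of brightness values by averaging
    each factor x factor block of input dots into one output dot."""
    h, w = len(frame), len(frame[0])
    out = []
    for y in range(0, h - h % factor, factor):
        row = []
        for x in range(0, w - w % factor, factor):
            block = [frame[y + dy][x + dx]
                     for dy in range(factor) for dx in range(factor)]
            row.append(sum(block) // len(block))
        out.append(row)
    return out

# A 4x4 input "frame" averaged 2:1 into a 2x2 one
frame = [
    [0, 0, 255, 255],
    [0, 0, 255, 255],
    [255, 255, 0, 0],
    [255, 255, 0, 0],
]
print(greek(frame, 2))  # → [[0, 255], [255, 0]]
```

Fine detail that fits inside a single block simply averages away, which is why the result looks smudgy.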

Now we come to the true use of the word "resolution".  Laser printers have very good resolution; it has been 300dpi since 1990, and the common standard now is 600dpi or 1200dpi.  Most people cannot tell the difference (subtended angle again) when the resolution is higher than 300dpi.  Traditionally, displays have the lowest resolution; 100dpi is quite common.  But even 100dpi does not hurt reading efficiency if something like ClearType, which uses graduated brightness to substitute for graduated dots, is used.

Lately, cell phone displays have been increasing in resolution without increasing in size.  The iPhone 4 has 326dpi: 960x640 in a 3.5" diagonal.  This is wasteful.  Lots of memory and electricity are used to render dots which cannot be consumed, because they are not visible to most people.
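The density figure can be checked the same way as the screen sizes earlier, just inverted: diagonal in pixels divided by diagonal in inches.  Computing from the rounded 3.5" diagonal gives roughly 330dpi, close to Apple's quoted 326; the small gap comes from rounding in the published numbers.

```python
import math

def dpi(width_px, height_px, diagonal_in):
    """Pixel density implied by a pixel grid and a diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(dpi(960, 640, 3.5)))  # → 330
```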

In conclusion, for a given number of displayable dots, if you want to transmit the maximum amount of information, you use the simplest typeface: a cell of 5x7 for each letter.  If you have plenty of dots to spare, then you can have the luxury of showing each letter more artistically with hundreds of dots.  Remember that the size of the display plays no part in the quantity of information you can show if the number of dots is the same.  A bigger display simply means you can see it from farther away.


Sunday, October 16, 2011

RDP Client for Mobiles

I have been looking for a good (and cheap) RDP client for my Android phone. I was about to pay $25 for Xtralogic's software when I saw the one from Jump Desktop for only $0.99. Having tested both, I find the Jump Desktop one better than the much more expensive one.

Jump Desktop's price for Apple is $14.99. Better grab Jump Desktop for your Android before the price goes up.

Jump Desktop's RDP client has the following useful and better features:

  • It supports both RDP and VNC, though I have not used VNC myself.
  • It is a standard RDP client, but at the same time, if your host is on an internal network and you don't know its external IP address, you can configure Jump Desktop to go via Google for connectivity instead. I haven't tried it myself, but I think the host contacts "Google's servers" (according to the FAQ), your client contacts Google, and Google makes the connection. Neat if it works.
  • It has two "cursors", making use on a small phone screen a lot easier. There is a big circle which you can drag to move the host's Windows cursor. Touch anywhere else and you can drag to pan the viewport showing the host's desktop, or dual-touch to zoom. As your phone's display is likely smaller than your host's desktop, without this feature a terminal session would not be practical.
  • Popping up the Android's virtual keyboard does not cover the host's desktop. Hence, you can see what you are typing.

One thing I can't find on Jump Desktop is audio options. For one of my hosts, I need to let audio play on the host as it is doing a recording. [Oct 31 Update: the latest version allows you to configure audio options, just like the Windows version.]

Like Xtralogic's, Jump Desktop does not run in the background, and it exits without prompting if you hit the Android back button one too many times. [Oct 31 Update: the latest version has a confirmation dialog before quitting.]

After purchasing Jump Desktop, I discovered Wyse's PocketCloud Remote RDP, which is free.  PocketCloud is almost the same as Jump Desktop except that it cannot zoom by touch.  You have to touch +/- keys on the screen.

I tested Yongtao Wang's RDP Lite some time ago and found it too basic. What surprises me is why Jump Desktop has only 1,000+ downloads when Xtralogic has 10,000+.

In conclusion, if you don't want to spend a single cent, go for Wyse's PocketCloud.  If you can afford a dollar, try Jump Desktop.

For a screen shot, see the dcpromo on two feet.

--
After using RDP on a mobile for a few minutes, I came to realize that it's not like working on a PC.  A PC is truly multi-tasking, and you can jump from any window to any window.  Working on a mobile is a bit restrictive.  If I press the Home button, I don't know whether the current app will keep running in the background or the system will kill it after a while.  Hence, if you are working on a remote session via RDP and a call comes in, what should you do?


Thursday, October 13, 2011

My Computer

I didn't want to describe my computer until I read this: http://tratt.net/laurie/tech_articles/articles/good_programmers_are_good_sysadmins_are_good_programmers.

First, what I do.  I write code, some of it, every day.  I administer networks and servers too.

I develop Windows apps, about a hundred thousand lines of C# code so far.  I do ASP.NET apps, more than twenty operational but small web sites.  I do Silverlight too, an LOB app used by two thousand users.  I have an unmanageable number of Powershell scripts doing all sorts of funny things that most people would have written a console or Windows app to do; this is possible because Powershell has full access to the complete .NET Framework class library.  I currently administer several PCs and about ten servers.  I install OSes with my own two hands from a floppy/CD/DVD/thumbdrive (i.e. SYSPREP images strictly prohibited), on well over 100 machine instances from DOS 1.1 to Windows Server 2008 R2.  I install SQL Servers too, since version 2005.  There is also the occasional Sharepoint Server, which I dislike.  Oh yes, I do Active Directory as well.  I install and operate my own on-premises Exchange Server 2010, with one email account: mine.  I build networks; right now I am running six in six different locations, five of them using HSDPA for WAN access.  I have Virtual PC on my primary machine with five images, but I seldom switch them on because performance is disappointing.  I monitor about 20 batch jobs daily.

The machine that is with me all the time (meaning at times it could be in the trunk of the car, but always reachable at short notice) is a 17-inch notebook with 1600x900 addressability and displayability.  At three places I frequent, I have positioned a 1680x1050 monitor.  So for >90% of the time, my notebook is running with a desktop of about 3200x1000 pixels.  I wish for more.

I respectfully disagree with Mr Tratt that a notebook is not powerful enough (see instead why it's overkill).  I also disagree that a notebook is not ergonomically sound.  I have not used a mouse since November 1996.  Yet I have not met another person who can operate a computer (i.e. access applications, work applications, enter data, manipulate windows, etc.) as fast as I can.  Not even anywhere close.  Perhaps my circle of acquaintances is too small.

At any one time, I have 30+ application windows open on my notebook, partially overlapping one another.  I have three Powershell consoles and two cmd.exe consoles, only one in elevated mode and with a danger-red background.  On a busy day my RJ-45 is connected to one (restricted) network, my wi-fi is my route to the Internet, and I have two VPN tunnels to two distant networks.  Typically I have six terminal service (TS) sessions connected to some of my remote computers.  The good thing about TS is that even if I am disconnected by a calamity such as an earthquake, the next time I reconnect I am exactly at the middle of the down motion of the mouse click that was disrupted.

Another thing, I have not shut down my computer since November 1996, that's when I got my first notebook.  I have upgraded eleven times since, but I have not shut any of the eleven notebooks down for the purpose of shutting it down.  Sure, I have to power it off to do hardware repairs and all that.  But I have not shut it down because I go to bed or drive somewhere or board a plane.  My computer is always on Sleep mode (not Hibernate) when I am on the move.

So, whenever I put my computer out of Sleep, it is exactly where it was previously.  I reboot my notebook once in two to three weeks at the point where it starts to behave weirdly.  I couldn't go longer than that, even with Windows 7.  I detest rebooting, because it would mean having to re-launch all my applications all over again and losing all the keyboard command buffers in my console sessions (I am that lazy).

Do you think I am insane?

Nov 2011 update: My mobile phone is the Samsung Galaxy Note.  It has a 1280x800 display.  This works nicely as all my remote desktops are at 1024x768.  See here why the Note display is a bit small.  See here why mobile computing is different from working on a notebook PC.  With the Note, finally I have the ability to access everything I want to access, anytime, even while not sitting down.  This is cloud computing in its purest form.

Wednesday, October 12, 2011

The case for an iPad

Do you know of anyone who has an iPad but not another personal computer?

Do you know of anyone who has an iPad but not a mobile phone?

I think the answers are overwhelmingly no for both questions.

Managing three devices is crazy, cloud services notwithstanding.  You spend more time managing the devices than using them.  It's not about enthusiasm.  It is just plain showing off.

A tablet computer is a great, probably the greatest, computing device for the situations when you can't be sitting down.  When you are not sitting down, you cannot do serious work.  Hence, an iPad is just that, for not serious work.  It is an expensive toy, just like those expensive hand bags with names you can't pronounce.

But what the heck, we need spenders like that to keep any economy functioning.

Today's iOS or Android is essentially a single-tasking system.  If you are reading a document, you can't be looking at another document to compare the two.  If you are in the middle of an RDP session and a phone call comes in, you are stuck, like a deer in the headlights.  This is a massive retrograde step from the days of the Lisa or Windows 1.x.  Until this changes, the notebook PC will remain the main workhorse for everybody.

Like expensive handbags, there will be many who would die to get an iPad.  And, like expensive handbags, there will be many who would rather die than be seen holding one.

Related: What is touch

Nov 2012 update: This secret is finally out: http://www.digitaltrends.com/mobile/tablets-are-for-old-people-women-and-those-who-like-playing-games-apparently/.