
Cell Phone Reception Bars


I’m a little late to the party on this topic. When Apple was dealing with the iPhone 4 antenna issue, ‘Antennagate’, back in July, it drew my attention to that ubiquitous little graphic tucked in the top left corner, and I started thinking that…

It’s Wrong

The graphic is wrong because it’s a bar chart. Bar charts are used to display and compare the values of a discrete data set. Using a bar chart is the wrong choice for two reasons.
First, bar charts have two axes: one for the discrete data set, and the other for the value of each of those data points. Cell phone reception has only one axis: how much bandwidth do I have out of my phone’s total possible bandwidth?

[Image: Signal Strength?]

Secondly, bar charts are for discrete data sets, whereas our cell phone reception is a continuum.
If this is correct, then wouldn’t something like this be more accurate?

[Image: Signal Strength]

I can see and understand my reception quality, but how do I communicate it?  What out of what?

[Image: Signal Strength]

Better: ‘I’ve got three out of five bars’

Better yet: more visually distinct, and I can still see my total of five. It’s best not to rely on a color change alone to indicate an item’s change of state, so use both color and shape.

[Image: Signal Strength]

Our current cell phone reception bars get this right.

[Image: iPhone 4]

I’m convinced that the bar chart is wrong, but I’m sure these guys at Apple et al aren’t amateurs, so I try again.

Maybe if we look at the chart more closely and break down the values

If I take the bar chart and make the following assumptions: the first bar has a value of one unit; the second equals two units, and so on; the totals are cumulative; and, lastly, five bars equals 100%, I can then map that onto a continuum.

[Image: Mapped to a Continuum]
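To check that arithmetic, here is a small sketch (Python, purely illustrative, using the assumed values above rather than anything from a real phone) that maps each cumulative bar total onto a 0–100% continuum:

```python
# Illustrative only: assumes bar n is worth n units (1, 2, 3, 4, 5),
# the totals are cumulative, and five bars equals 100% of the continuum.
bar_units = [1, 2, 3, 4, 5]
total = sum(bar_units)  # 15 units in all

cumulative = 0
for n, units in enumerate(bar_units, start=1):
    cumulative += units
    print(f"{n} bar(s) -> {cumulative / total:.0%} of the continuum")

# 1 bar -> 7%, 2 bars -> 20%, 3 bars -> 40%, 4 bars -> 67%, 5 bars -> 100%
```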

It looks like all bars are not created equal. Which brings me to:

Why it’s right

After a little searching and reading, I learned that signal strength and signal performance are not the same thing.
To summarize what I found: the bars represent signal strength, not the ability of the signal to carry your call, i.e. your signal quality. The usable portion of the signal is extremely variable and hard to calculate.

The performance, as in the quality of the reception, only drops noticeably when there is a significant drop in signal strength. Put another way, signal quality plateaus after a certain level of signal strength.
Visualized, it may look like this:

[Image: Signal Quality]
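One way to think of that relationship is as a curve that climbs quickly and then flattens. A toy sketch, with made-up numbers rather than real radio measurements:

```python
def signal_quality(strength_pct):
    """Toy model of the plateau: quality improves steeply at low
    signal strength, then levels off. The 40% knee is an assumed
    value for illustration, not a real measurement."""
    knee = 40
    return 100 if strength_pct >= knee else round(100 * strength_pct / knee)

for s in (5, 10, 20, 40, 80, 100):
    print(f"strength {s:3d}% -> quality {signal_quality(s)}%")
```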

If we map that to the bar chart again:

[Image: Map to Bar Chart]

Here’s some progress. Apparently it’s harder to diagnose issues at lower signal strengths. So by lumping the largest area of the chart, the plateau, into the fifth bar, the other four bars can show more detail. The resolution, detail, or ‘information density’ in the left part of the chart is increased.

From CNET’s Marguerite Reardon:

Why doesn’t Apple just measure the bars in a linear fashion so that each bar represents an equal share of decibels? Because the range is so big, it’s harder to diagnose problems at lower signal strengths. Signal strength measurement doesn’t need to be very granular at the top end of the scale because performance is only affected when it drops off considerably. But more granularity is needed in the lower part of the scale.

Read more at CNET…
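To make that concrete, here is a rough sketch of a non-linear mapping from signal strength (in dBm, where more negative means weaker) to bars. The thresholds are invented for illustration; real handsets use their own, generally undisclosed, cut-offs:

```python
def bars_from_dbm(rssi_dbm):
    """Map a received signal strength reading (dBm) to 0-5 bars.
    The thresholds are hypothetical: the weak end of the scale is
    sliced finely, while everything above roughly -90 dBm collapses
    into the fifth bar, mirroring the plateau described above."""
    thresholds = [  # (minimum dBm for this many bars) - assumed values
        (-90, 5),
        (-98, 4),
        (-104, 3),
        (-108, 2),
        (-112, 1),
    ]
    for minimum, bars in thresholds:
        if rssi_dbm >= minimum:
            return bars
    return 0  # effectively no usable signal

for reading in (-60, -95, -102, -110, -120):
    print(f"{reading} dBm -> {bars_from_dbm(reading)} bar(s)")
```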

The challenge and why it’s still wrong.

Exploring the technical aspects further uncovers more complications:
CNET…
AnandTech (these guys have a terrific set of articles on this issue, if you’re game.)
MetaFilter

As I mentioned above, the bars represent signal strength, not the ability of the signal to carry your call, yet we wrongly assume that is what they are telling us. You can be standing under a cell tower with five bars and still have very poor reception, because the tower is overloaded with other users.
There’s more. The lack of an industry standard against which to measure reception quality undermines our ability as users to communicate about reception quality across different devices: your device’s three bars may not equal my three bars.

I feel the designers of the reception graphic have given a good answer to a difficult problem. Since signal strength is apparently the only reliable data available, and we know that at roughly x signal strength we’ll start having noticeable quality loss, let’s focus the chart there.

What has been lost, though, is an understanding of why and how people (us) use this information. It’s 10:30pm on a Wednesday, so I’m not about to start any user research, but I can think of how we talk about reception:

‘I’ve only got one bar’

‘I’ve got three bars, should be fine’

‘Five bars’.

‘No reception’

‘Bad reception’

‘Good reception’

‘Oh, that’s a dead zone’.

I hope that sounds familiar. I notice two things there: there is no frame of reference, we just assume it’s ‘…out of 5 bars’; and there is a near-binary distinction between on and off, or good reception and no reception, with bad reception in between.

So, to conclude, and to make a little contribution: could a simple traffic-light-style indicator work?

[Image: Traffic Light]
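As a rough sketch of what that could mean (the three states and the cut-off values are my own assumptions, not a real design spec), the indicator would collapse the reading into the states we actually talk about:

```python
def traffic_light(quality_pct):
    """Collapse reception into the three states people actually use.
    The quality_pct input and the cut-offs are assumptions for
    illustration; a reliable quality measure is exactly the hard
    part discussed above."""
    if quality_pct >= 60:
        return "green"   # good reception
    if quality_pct >= 20:
        return "amber"   # bad reception, calls may struggle
    return "red"         # effectively a dead zone

for q in (90, 45, 5):
    print(f"quality {q}% -> {traffic_light(q)}")
```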
