Binary Numbers and Computer Speed

Posted by PITHOCRATES - April 2nd, 2014

Technology 101

Computers are Good at Arithmetic thanks to Binary Numbers

Let’s do a fun little experiment.  Get a piece of paper and a pen or a pencil.  And then using long division divide 4,851 by 34.  Time yourself.  See how long it takes to complete this.  If the result is not a whole number take it to at least three places past the decimal point.  Okay?  Ready… start.

Chances are the older you are the faster you did this.  Because once upon a time you had to do long division in school.  In that ancient era before calculators.  Younger people may have struggled with this.  Because the result is not a whole number.  Few probably could do this in their head.  Most probably had a lot of scribbling on that piece of paper before they could get 3 places past the decimal point.  The answer to three places past the decimal point, by the way, is 142.676.  Did you get it right?  And, if so, how long did it take?

Probably tens of seconds.  Or minutes.  A computer, on the other hand, could crunch that out faster than you could punch the buttons on a calculator.  Because one thing computers are good at is arithmetic.  Thanks to binary numbers.  The language of all computers.  1s and 0s to most of us.  But two different states to a computer.  Which form information the computer can understand and process.  Fast.
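
Don’t believe it?  Here is a minimal sketch in Python (one illustrative language among many) doing the same division from the exercise above:

```python
# The same long-division problem, done by the machine.
quotient = 4851 / 34

# Round to three places past the decimal point, as in the exercise.
print(round(quotient, 3))  # 142.676
```

The machine finishes this in a tiny fraction of a second.  No scribbling required.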

A Computer can look at Long Streams of 1s and 0s and make Perfect Sense out of Them

The numbers we use in everyday life are from the decimal numeral system.  Or base ten.  For example, the number ‘4851’ contains four digits.  Where each digit can be one of 10 values (0, 1, 2, 3…9).   And then the ‘base’ part comes in.  We say base ten because each digit is multiplied by 10 to the power of n.  Where n=0, 1, 2, 3….  So 4851 is the sum of (4 X 10^3) + (8 X 10^2) + (5 X 10^1) + (1 X 10^0).  Or (4 X 1000) + (8 X 100) + (5 X 10) + (1 X 1).  Or 4000 + 800 + 50 + 1.  Which adds up to 4851.
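
The place-value expansion above can be sketched in a few lines of Python (the variable names are just illustrative):

```python
# Expand 4851 into its base-ten place values.
number = 4851
digits = [int(d) for d in str(number)]  # [4, 8, 5, 1]

# Each digit is multiplied by 10 to the power of its position.
terms = [d * 10 ** n for n, d in enumerate(reversed(digits))]
print(terms)       # [1, 50, 800, 4000]
print(sum(terms))  # 4851
```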

But the decimal numeral system isn’t the only numeral system.  You can do this with any base number.  Such as 16.  What we call hexadecimal.  Which uses 16 distinct values (0, 1, 2, 3…9, A, B, C, D, E, and F).  So 4851 is the sum of (1 X 16^3) + (2 X 16^2) + (15 X 16^1) + (3 X 16^0).  Or (1 X 4096) + (2 X 256) + (15 X 16) + (3 X 1).  Or 4096 + 512 + 240 + 3.  Which adds up to 4851.  Or 12F3 in hexadecimal.  Where F=15.  So ‘4851’ requires four positions in decimal.  And four positions in hexadecimal.  Interesting.  But not very useful.  As 12F3 isn’t a number we can do much with in long division.  Or even on a calculator.
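
Python’s built-in base conversions can confirm the hexadecimal arithmetic above:

```python
# Sum the hexadecimal place values of 12F3 (F=15).
value = (1 * 16**3) + (2 * 16**2) + (15 * 16**1) + (3 * 16**0)
print(value)            # 4851

# The built-ins agree in both directions.
print(hex(4851))        # 0x12f3
print(int('12F3', 16))  # 4851
```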

Let’s do this one more time.  And use 2 for the base.  What we call binary.  Which uses 2 distinct values (0 and 1).  So 4851 is the sum of (1 X 2^12) + (0 X 2^11) + (0 X 2^10) + (1 X 2^9) + (0 X 2^8) + (1 X 2^7) + (1 X 2^6) + (1 X 2^5) + (1 X 2^4) + (0 X 2^3) + (0 X 2^2) + (1 X 2^1) + (1 X 2^0).  Or (1 X 4096) + (0 X 2048) + (0 X 1024) + (1 X 512) + (0 X 256) + (1 X 128) + (1 X 64) + (1 X 32) + (1 X 16) + (0 X 8) + (0 X 4) + (1 X 2) + (1 X 1).  Or 4096 + 0 + 0 + 512 + 0 + 128 + 64 + 32 + 16 + 0 + 0 + 2 + 1.  Which adds up to 4851.  Or 1001011110011 in binary.  Which is gibberish to most humans.  And a little too cumbersome for long division.  Unless you’re a computer.  They love binary numbers.  And can look at long streams of these 1s and 0s and make perfect sense out of them.
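
And the binary expansion checks out the same way:

```python
# Sum the binary place values of 1001011110011.
bits = '1001011110011'
value = sum(int(b) * 2**n for n, b in enumerate(reversed(bits)))
print(value)         # 4851

# The built-ins agree in both directions.
print(bin(4851))     # 0b1001011110011
print(int(bits, 2))  # 4851
```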

A Computer can divide two Numbers in a few One-Billionths of a Second

A computer doesn’t see 1s and 0s.  It sees two different states.  A high voltage and a low voltage.  An open switch and a closed switch.  An on and an off.  Because of this, machines that use binary numbers can be extremely simple.  Computers process bits of information.  Where each bit can be only one of two things (1 or 0, high or low, open or closed, on or off, etc.).  Greatly simplifying the electronic hardware that holds these bits.  If computers processed decimal numbers, however, just imagine the complexity that would require.

If working with decimal numbers a computer would need to work with, say, 10 different voltage levels.  Requiring the ability to produce 10 discrete voltage levels.  And the ability to detect 10 different voltage levels.  Greatly increasing the circuitry for each digit.  Requiring far more power consumption.  And producing far more damaging heat that requires more cooling capacity.  As well as adding more circuitry that can break down.  So keeping computers simple makes them less expensive and more reliable.  And if each bit requires less circuitry you can add a lot more bits when using binary numbers than you can when using decimal numbers.  Allowing bigger and more powerful number crunching ability.

Computers load and process data in bytes.  Where a byte has 8 bits.  Which makes hexadecimal so useful.  If you have 2 bytes of data you can break it down into 4 groups of 4 bits.  Or nibbles.  Each nibble is a 4-bit binary number that can be easily converted into a single hexadecimal number.  In our example the binary number 0001 0010 1111 0011 easily converts to 12F3 where the first nibble (0001) converts to hexadecimal 1.  The second nibble (0010) converts to hexadecimal 2.  The third nibble (1111) converts to hexadecimal F.  And the fourth nibble (0011) converts to hexadecimal 3.  Making the man-machine interface a lot simpler.  And making our number crunching easier.
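
The nibble-to-hex conversion above can be sketched like this (the variable names are just illustrative):

```python
# Two bytes holding our example number, written as four nibbles.
word = 0b0001_0010_1111_0011  # 4851

# Shift each nibble down and mask off its 4 bits.
nibbles = [(word >> shift) & 0xF for shift in (12, 8, 4, 0)]
print(nibbles)  # [1, 2, 15, 3]

# Each nibble converts to exactly one hexadecimal digit.
print(''.join('{:X}'.format(n) for n in nibbles))  # 12F3
```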

The simplest binary arithmetic operation is addition.  And it happens virtually instantaneously at the bit level.  We call the electronics that make this happen logic gates.  A typical logic gate has two inputs.  Each input can be one of two states (high voltage or low voltage, etc.).  Each combination of inputs produces a defined output (high voltage or low voltage, etc.).  If you change one of the inputs the output can change.  Computers have vast arrays of these logic gates that can process many bytes of data at a time.  All you need is a ‘pulsing’ clock to sequentially apply these inputs.  With the outputs providing an input for the next logical operation on the next pulse of the clock.
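
As a sketch of the idea, here is a one-bit full adder built from AND, OR and XOR operations, chained together to add two binary numbers bit by bit.  This illustrates the principle only, not how any particular CPU is actually wired:

```python
def full_adder(a, b, carry_in):
    """Add three input bits (each 0 or 1) using basic gate operations."""
    s = a ^ b ^ carry_in                        # XOR gates produce the sum bit
    carry_out = (a & b) | (carry_in & (a ^ b))  # AND and OR gates produce the carry
    return s, carry_out

def add_bits(x, y, width=16):
    """Chain full adders, the way a clocked array of gates ripples a carry."""
    result, carry = 0, 0
    for i in range(width):
        bit, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= bit << i
    return result

# The numbers from our long-division example, added in pure binary.
print(add_bits(4851, 34))  # 4885
```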

The faster the clock speed the faster the computer can crunch numbers.  We once measured clock speeds in megahertz (1 megahertz is one million pulses per second).  Now the faster CPUs are in gigahertz (1 gigahertz is 1 billion pulses per second).  Because of this incredible speed a computer can divide two numbers to many places past the decimal point in a few one-billionths of a second.  And be correct.  While it takes us tens of seconds.  Or even minutes.  And our answer could very well be wrong.



It’s not the Traffic that crashed the Obamacare Website but Bad Website Design

Posted by PITHOCRATES - October 6th, 2013

Week in Review

The Obamacare website got off to a less than illustrious start.  That is, it failed to work.  The Obama administration said the website failed because it was overwhelmed by so many people wanting to log on at the same time to get them some Obamacare.  But people who actually know how to design websites said it wasn’t a capacity problem.  The problem is something common whenever the government tries to do anything.  They just did a poor job designing the Obamacare website (see Obamacare architecture flawed: Experts by Reuters posted 10/6/2013 on The Times of India).

Days after the launch of the federal government’s Obamacare website, millions of Americans looking for information on new health insurance plans were still locked out of the system even though its designers scrambled to add capacity.

The U.S. Department of Health and Human Services, which oversaw development of the site, declined to make any of its IT experts available for interviews. CGI Group Inc, the Canadian contractor that built, is “declining to comment at this time,” said spokeswoman Linda Odorisio.

Five outside technology experts interviewed by Reuters, however, say they believe flaws in system architecture, not traffic alone, contributed to the problems.

For instance, when a user tries to create an account on, which serves insurance exchanges in 36 states, it prompts the computer to load an unusually large amount of files and software, overwhelming the browser, experts said.

If they are right, then just bringing more servers online, as officials say they are doing, will not fix the site…

One possible cause of the problems is that hitting “apply” on causes 92 separate files, plug-ins and other mammoth swarms of data to stream between the user’s computer and the servers powering the government website, said Matthew Hancock, an independent expert in website design. He was able to track the files being requested through a feature in the Firefox browser…

[Matthew Hancock, an independent expert in website design,] said because so much traffic was going back and forth between the users’ computers and the server hosting the government website, it was as if the system was attacking itself.

Hancock described the situation as similar to what happens when hackers conduct a distributed denial of service, or DDOS, attack on a website: they get large numbers of computers to simultaneously request information from the server that runs a website, overwhelming it and causing it to crash or otherwise stumble. “The site basically DDOS’d itself,” he said.

Did you catch that?  President Obama and the Democrats have lambasted corporations for outsourcing American jobs.  Called them a whole bunch of nasty names.  Unpatriotic.  Greedy.  You name it.  And yet here they are.  Outsourcing the design of the Obamacare website.  So I guess the Obama administration is unpatriotic, greedy, etc.

Those “92 separate files, plug-ins and other mammoth swarms of data to stream between the user’s computer and the servers powering the government website” does not fill one with a lot of confidence that our private and personal data they’re collecting will be secure.  Complicated systems are more subject to breaking down.  And getting hacked.

If these people who actually know how to design websites are right that means the Obama administration lied to us about what was wrong with the Obamacare website.  Of course, President Obama said if we liked our health insurance plan and our doctor we could keep them.  Which were lies.  It seems like the Obama administration has a habit of lying to the American people to get what they want against our will.

Not only that, they sic the IRS on their political enemies.  As well as using the IRS to suppress voter turnout on the right.  By making it harder for Tea Party groups, who rose up in opposition to Obamacare, to be politically active like they were in the 2010 mid-term election.  Had the Obama administration not used the full weight of their powers to suppress the Tea Party during the 2012 election President Obama would not have won reelection.  And Obamacare may have already been repealed by this time.  For it was conservatives that sat home and didn’t vote for Romney.  Because the Tea Party wasn’t active like in the previous election to motivate them.  Or they were intimidated by the left and were afraid of the IRS.  Further reasons to fear the data mining of the Obamacare website.  Which the IRS will have access to.

Perhaps we should be grateful that they designed the Obamacare website poorly.  For it will let us keep our personal and private information secure for a little longer.



Global Warming is more like a Religion than a Science for we must Accept Everything they Tell us on Faith

Posted by PITHOCRATES - December 10th, 2011

Week in Review

We may be on the brink of something great.  Or at another dead end (see What if there is no Higgs boson? by Lisa Grossman posted 12/9/2011 on New Scientist).

Rumour has it they have found hints of the Higgs at a mass of 125 gigaelectronvolts, about 133 times the mass of a proton. What is known for sure, though, is that researchers from the LHC’s main detectors, ATLAS and CMS, will separately present the past year’s worth of data from the proton collider. That represents more than 300 trillion high-speed particle collisions, more than twice the amount of data reported at a conference in August. That is still not enough data to be able to rule the Higgs definitively in or out, but it should be enough to show hints of the Higgs if it exists in the mass range that had previously not been scrutinised.

Now this is science.  They have data from more than 300 trillion high-speed particle collisions and it’s still not enough to prove anything.  But it may give them a ‘hint’ that the Higgs boson may cross over from the world of theoretical physics to the world of experimental physics.  Proven again and again by their peers throughout the world.  Those with access to large particle accelerators, that is.  But if they don’t see this ‘hint’ then they may discard their model.  And start all over with a different model.

“If we witness a lack of events in the full mass range, then clearly we will start disfavouring the presence of the standard model Higgs boson in LHC data,” says CMS spokesperson Guido Tonelli. “To really exclude it we would need additional data. But if in this amount of data we don’t see any indication that something is happening, the most likely hypothesis is that we have to look for another solution…”

“It’s the job of theoretical physicists to game out all the different possibilities, so that the experimentalists have all the tools that they need when they eventually discover or don’t discover whatever it is the LHC will or will not reveal,” says Ellis.

You see, physicists are real scientists.  You can tell by how they experiment and analyze the crap out of the resulting data.  And their experiments have included up to 300 trillion high-speed particle collisions.  That’s ‘trillion’ with a ‘t’.  Which is a lot.

Interestingly you don’t read anything like this in the global warming ‘scientific’ community.  We don’t hear about data compiled from trillions of experiments.  What we hear are the same things we hear from Al Gore.  A politician.  Who talks politics.  Not science.

And we never hear them questioning their models.  As if their models were not hypothesized by man.  But handed down by God.  And are beyond questioning.  Like a sacred religious text.  You know, when you think about it, global warming is more like a religion than a science.  Because we’re never allowed to question the ‘science’.  But must accept everything they tell us on faith.  Just like in a religion.



If only Global Warming Required the Same Level of Scrutiny as Real Science

Posted by PITHOCRATES - November 19th, 2011

Week in Review

Now this is science.  A second experiment was done showing particles moving faster than the speed of light.  Yet some are still skeptical (see Second experiment confirms faster-than-light particles by Brian Vastag posted 11/17/2011 on The Washington Post).

While the second experiment “has made an important test of consistency of its result,” Ferroni added, “a final word can only be said by analogous measurements performed elsewhere in the world.”

That is, more tests are needed, and on other experimental setups. There is still a large crowd of skeptical physicists who suspect that the original measurement done in September was an error.

Should the results stand, they would upend more than a century of modern physics.

Imagine that.  Twice now they’ve sped particles past the speed of light in a laboratory environment.  But some are still skeptical.  And yet global warming is a scientific fact.  Even though global warming is pure conjecture.  With no way possible to test this theory in the laboratory.  And no way to review empirical data going back through the previous ice ages to see if this ‘warming’ is any different from the warming that pushed glaciers back some thousand miles or more.  Because there is no empirical data that predates modern man.

So what’s the difference between these two?  One is science subjected to rigorous testing.  The other is not.  But rather something accepted on faith.  So they can implement environmental regulations.  Create silly things like carbon trading.  So they can collect a lot of fines and fees.  While making our lives more difficult.

If only global warming required the same level of scrutiny as real science.

