Data Representation: Millions of Colors


Lesson synopsis

Display devices on cellphones, tablets, and computers of all sizes use bits of information to represent color. By first creating and then playing a card game, students learn how additive color is represented as binary and hexadecimal numbers. They also get practice recognizing and manipulating binary and hexadecimal representations.
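As a quick illustration of the idea (illustrative only, not part of the published lesson materials), the short Python sketch below writes one 8-bit color channel in decimal, binary, and hexadecimal:

# Illustrative sketch: one 8-bit color channel shown in three notations.
value = 173                   # any channel intensity from 0 to 255
print(bin(value))             # 0b10101101  (8 binary digits)
print(hex(value))             # 0xad        (2 hexadecimal digits)
print(int("10101101", 2))     # 173, converting the binary text back to decimal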

Age Levels

14 - 17 years

Objectives

Introduce students to:
number systems used in computing: binary, hexadecimal.
how color is displayed on digital devices.
how and why additive color is represented as a single number.
why there are millions of colors available on mobile devices and computer screens (see the sketch after this list).
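
As a rough sketch of the last two points (Python is an assumption here; the lesson itself uses a card game rather than code), the example below packs three 8-bit channels into a single 24-bit number and counts how many distinct colors that allows:

# Illustrative sketch: packing red, green, and blue channels (0-255 each)
# into a single 24-bit number, the way additive color is stored for most displays.
def pack_rgb(r, g, b):
    return (r << 16) | (g << 8) | b   # red in the high byte, blue in the low byte

color = pack_rgb(255, 128, 0)         # an orange tone
print(hex(color))                     # 0xff8000
print(256 ** 3)                       # 16777216 possible values

Because each of the three channels can take 256 values, a 24-bit color can represent 256 x 256 x 256 = 16,777,216 distinct colors, which is where the "millions of colors" in the lesson title comes from.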

Anticipated learner outcomes

Students will demonstrate/explain:
how information is stored in binary, and represented in hexadecimal form.
how additive color is represented in binary and hexadecimal.
how to add and subtract hexadecimal numbers (a short worked example follows this list).
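
A minimal worked example of the last outcome, sketched in Python as one way students might check their hand arithmetic (the language choice is an assumption, not part of the lesson):

# Illustrative sketch: adding and subtracting hexadecimal numbers.
# By hand: 0x2F + 0x1A -> F + A = 0x19 (write 9, carry 1), then 2 + 1 + 1 = 4, so 0x49.
a, b = 0x2F, 0x1A
print(hex(a + b))   # 0x49
print(hex(a - b))   # 0x15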

Optional Writing Activity

Write a short report summarizing how additive color is represented in digital computers.


Reviews

The main link you need (Crayola Crayon Colors https://en.wikipedia.org/wiki/List_of_Crayolacrayoncolors) does not exist.
Thanks for bringing this to our attention. This link has now been updated.
- TryComputing.org team


Turing machine
Alan Mathison Turing

Did you know that computing has been used in military espionage and has even influenced the outcome of major wars? Alan Mathison Turing designed the code-breaking machine that enabled the deciphering of German communications during WWII; in Winston Churchill's words, this was the single largest contribution to victory. He also laid the groundwork for visionary fields such as automatic computing engines, artificial intelligence, and morphogenesis. Despite his influential work in the field of computing, Turing faced extreme prejudice during his lifetime because of his sexual orientation. There is no doubt that computers are a ubiquitous part of our lives, thanks in large measure to Turing's contributions.

Cursor
James Dammann

If you have used a word processor today, moved your mouse on your laptop, dragged an object around on your smartphone, or highlighted a section of text on your tablet, you can thank Jim Dammann. In 1961, during his second year at IBM and just one year after completing his PhD, Jim created the concept of something we all take for granted today: the cursor. He documented the idea by describing how a cursor would be used in word processing operations.

After retiring from IBM, Jim went on to inspire future generations of software engineers at Florida Atlantic University. His work there also demonstrated his creativity: he spent considerable effort enhancing the software engineering program by integrating ideas and feedback from local industry into the university curriculum. Today, Jim lives in the Westlake Hills west of Austin, Texas, and spends most of his time in his art studio. He wrote and published The Opaque Decanter, a collection of poems about art that offers a new view of part of art history.

RISC processor
John Hennessy

Have you ever wondered how computers can execute complex commands in mere seconds? John Hennessy is a pioneer of reduced instruction set computing (RISC) architecture, which employs small, highly optimized sets of instructions to greatly enhance computer performance. He was instrumental in transferring the technology, specifically the MIPS RISC architecture, to industry: he co-founded MIPS Technologies and, with David A. Patterson, co-authored the classic textbook on computer architecture.

As a member of the Stanford faculty, he rose to be Chairman of the Computer Science Department, Dean of the School of Engineering, then Provost, and finally, in 2000, President of Stanford, a post he still holds. Hennessy holds a Master's and a Ph.D. in Computer Science from SUNY Stony Brook. He is an IEEE Fellow and was selected to receive the IEEE Medal of Honor in 2012. Hennessy also launched significant initiatives that helped foster interdisciplinary research in the biosciences and bioengineering at Stanford.

@ symbol
Ray Tomlinson

Have you ever considered that someone, at some point, was in a position to choose what symbol would be used to separate the user from their location in an email address? That person, it turns out, was Ray Tomlinson, and in 1971 he chose "@". Tomlinson is credited with demonstrating the first email sent between computers on a network, and when asked what inspired him to make this selection he said, "Mostly because it seemed like a neat idea."

After completing his Master's degree at MIT in 1965, Ray joined the Information Sciences Division of Bolt Beranek and Newman Inc. of Cambridge, Massachusetts. Since then he has made many notable contributions to the world of network computing. He was a co-developer of the TENEX computer system that was popular in the earliest days of the Internet; he developed the packet radio protocols used in the earliest internetworking experiments; he created the first implementation of TCP; and he was the principal designer of the first workstation attached to the Internet.

First computer mouse
Douglas Engelbart

In 1967, Douglas Engelbart applied for a patent for an "X-Y position indicator for a display system," which he and his team developed at the Stanford Research Institute (SRI) in Menlo Park, California. The device, a small wooden box with two metal wheels, was nicknamed a "mouse" because the cable trailing out of one end resembled a tail.

In addition to the first computer mouse, Engelbart's team developed computer interface concepts that led to the graphical user interface (GUI) and were integral to the development of ARPANET, the precursor to today's Internet. Engelbart received his bachelor's degree in electrical engineering from Oregon State University in 1948, followed by an MS in 1953 and a Ph.D. in 1955, both from the University of California, Berkeley.
