I washed some apples and served them on a hand-carved soapstone platter (see photo). When we began to eat them, we noticed the apples had left blue stains on the plate. What could have caused this?
Among several perfectly normal sweet potatoes, we came across this rather oddly coloured specimen (see photo). Although all the others were orange, only half of this one had complied, the other half being white. Can anyone explain why this should be and how it happens so perfectly down the length of the vegetable?
From what I think I understand about the way the eye works, we have cells that can detect blue, green or red light, which I guess correspond to particular wavelengths (475 nm, 510 nm and 650 nm respectively), yet yellow light, for example, has a wavelength of about 570 nm. Is this picked up only partially by the red and green receptors? If so, screen technology using three colours of pixel must be perfectly adapted to human eyes, yet a new type of television has been released with a yellow pixel as well. Would that offer any advantage to colour perception, or is it just smoke and mirrors?
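[Editor's note: since the question turns on how a screen can evoke yellow without emitting 570 nm light, here is a minimal Python sketch of the idea. The Gaussian cone-sensitivity curves and the primary wavelengths below are illustrative assumptions, not real colorimetric data.]

```python
import numpy as np

# Very rough Gaussian approximations of the three cone sensitivities
# (peak wavelength, width in nm) -- illustrative values only.
CONES = {"L": (565.0, 50.0), "M": (540.0, 45.0), "S": (445.0, 30.0)}

def cone_response(wavelength_nm, cone):
    peak, width = CONES[cone]
    return np.exp(-((wavelength_nm - peak) ** 2) / (2 * width ** 2))

def responses(wavelength_nm):
    return np.array([cone_response(wavelength_nm, c) for c in ("L", "M", "S")])

# Monochromatic "yellow" light at 570 nm.
yellow = responses(570.0)

# A screen has no 570 nm primary; it mixes a red and a green primary.
red, green = responses(630.0), responses(532.0)

# Solve for red/green intensities whose L and M responses match the yellow.
A = np.array([[red[0], green[0]],
              [red[1], green[1]]])
w = np.linalg.solve(A, yellow[:2])
mix = w[0] * red + w[1] * green

print("cone responses to 570 nm light :", np.round(yellow, 3))
print("cone responses to red+green mix:", np.round(mix, 3))
# The L and M responses match and the S response is tiny in both cases,
# so the eye cannot distinguish the mixture from true spectral yellow.
```

On this simplified model, the red/green mixture is a metamer of spectral yellow: the two physically different lights produce (almost) the same three cone signals, which is why three well-chosen primaries are normally enough for a display.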
Whenever I see NASA's space pictures they are colourful, but other things, such as video of asteroids hitting a planet, look black and white. Why is that? I looked on NASA's website, which says something about our eyesight, the Earth's environment and the colours of light, but I don't understand it.
I read about the possibility of people having tetrachromacy and would like to know whether a test similar to an Ishihara test is possible on a standard RGB display.