People can process information from sensation and memory to detect, acquire, select, organize, recognize, identify, categorize, discriminate, and interpret information about organisms, objects, features, times, and locations {perception}|. Perception establishes current environment and organism state and does not initiate or control action or behavior.
Perception acquires information about physical objects and events, using unconscious inductive inference. Senses measure pressure, temperature, concentration, frequency data, sound intensity, light intensity, angle, position, and time. It detects perceptual features and feature relations. It can detect angles and orientations. It can detect separateness and overlap. It can detect bilateral, radial, rotational, and translational symmetries. It can detect straight, curved, rotational, spiral, translational, and oscillatory trajectories. It can detect circular, elliptical, ovoid, heart, diamond, square, rectangle, and triangle shapes. It can detect spatial and temporal relations, such as under, over, near, far, before, and after. It can separate figure and ground, horizontal and vertical, and static and moving [Goldstein and Maiden, 2001].
requirements
Perception requires sensation and does not require awareness. Perception does not require consciousness, subject, or person.
biology
All mammals have perception. Perception can involve amygdala, septum, hypothalamus, insula, and cingulate gyrus.
properties
Perception has limited information capacity.
Initial perceptions can change with further mental processing.
Sense receptors respond to stimuli with sensitivity, accuracy, and precision.
properties: continuity
Perception is continuous, not discrete. Perceptions have no gaps and no overlaps in intensity, time, space, frequency, or quality.
properties: discrimination
Perception can detect differences between stimuli, patterns, or objects, if difference is above threshold. Visual discrimination takes 40 milliseconds to 100 milliseconds.
properties: formal system
Perception is complete and consistent and so is a formal system, which can have axioms, statements, and reasoning. Formal properties describe how mind uses sensations to get perceptions.
properties: intensity
Feature values have intensity range. People typically can identify no more than five different intensity levels. The lowest intensity detectable during measured time is one energy unit, such as photon, which causes one chemical reaction. The highest intensity detectable causes physical changes rather than chemical reactions. It can saturate receptors, stretch cell membranes past elastic limit, coat receptors, or damage cells. Intensity accuracy is one to two orders of magnitude poorer than just-noticeable-difference accuracy. People judge intensity relative to other intensities. Sense qualities change with intensity. Isolated sensory signals can only signal that stimulus exists, not define intensity value.
If people judge intensity by ratio {magnitude estimation, perception}, the preferred method, a power law relates perceived intensity {subjective magnitude} and stimulus intensity: S = a * I^k, where a is a constant that depends on sense, k is a constant that depends on attribute, I is stimulus intensity, and S is perceived intensity. Exponent k varies from 0.33 for luminance to 3.5 for electric shock. An alternative logarithmic relation between response magnitude and stimulus magnitude is r = a + b * log(s), where r is response magnitude, s is stimulus magnitude, and a and b are constants.
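A minimal numerical sketch of the power law above; the constant a is an illustrative placeholder, and the exponents are the values cited in the text (0.33 for luminance, 3.5 for electric shock), not fitted data.

```python
# Sketch of Stevens' power law S = a * I^k for perceived intensity.
# The constant a is an illustrative placeholder, not a measured value.

def perceived_intensity(stimulus_intensity, a=1.0, k=0.33):
    """Perceived intensity S = a * I^k (power law)."""
    return a * stimulus_intensity ** k

if __name__ == "__main__":
    for sense, k in [("luminance", 0.33), ("electric shock", 3.5)]:
        # Doubling the stimulus multiplies the percept by 2^k.
        ratio = perceived_intensity(2.0, k=k) / perceived_intensity(1.0, k=k)
        print(f"{sense}: doubling the stimulus multiplies the percept by {ratio:.2f}")
```

For luminance (k = 0.33), doubling the stimulus raises the percept by only about 26%; for electric shock (k = 3.5), doubling the stimulus raises the percept more than tenfold.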
Neuron has refractory period after spike, so spikes occur at a limited frequency. Frequency is higher if stimulus is greater, until frequency maximizes. Number of spikes per second corresponds to an energy flow. If frequency/flow passes threshold, synapse sends signal to next neuron. Higher frequencies send more signals until flow maximizes. Neurons have energy flow, with amplitude, frequency if flow varies, pressure, resistance, and capacitance. Circuits and processes behave like fluid or electrical flow in pipes and circuits. Variations in flow make perception speckle or vary in density.
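A sketch, under simplifying assumptions, of a rate code in which the refractory period caps firing frequency; the gain, threshold, and 2-millisecond refractory period are hypothetical parameters, not measured neuron values.

```python
# Sketch of a rate code: firing frequency grows with stimulus intensity
# until the refractory period caps it. Gain, threshold, and refractory
# period are illustrative assumptions, not measured neuron parameters.

def firing_rate(stimulus, gain=50.0, threshold=0.1, refractory_s=0.002):
    """Spikes per second for a given stimulus intensity."""
    if stimulus <= threshold:
        return 0.0
    max_rate = 1.0 / refractory_s          # refractory period limits the rate
    return min(gain * (stimulus - threshold), max_rate)

if __name__ == "__main__":
    for s in (0.05, 0.5, 2.0, 50.0):
        print(f"stimulus {s:5.2f} -> {firing_rate(s):6.1f} spikes/s")
```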
properties: intensity fade
If not renewed, inhibition reduces intensity, leaving only empty space.
properties: invariance
Connected lines, topological order, texture, and color do not change with distance or perspective.
properties: labeling
Mind labels intensities, locations, times, and objects and labels links among features and objects {labeled link, mind}. Mind uses labels for learning, memory, and recall.
properties: reference frame
Perceptions seem to be in a stationary world, in which body, head, and eyes move. Fixed reference frame optimizes distance and trajectory calculations and minimizes body, head, and eye deviations from straight-line motion. Fixed reference frame minimizes intensity and distance ratios, allowing perceptual constancies.
Babies develop fixed reference frame as they compensate for motions as they move.
properties: scale
Mind represents sizes and locations at multiple scales. Local signals have high precision, and global signals have low precision [Clarke, 1995].
properties: senses inside and outside
Sight, hearing, touch, taste, and smell detect stimuli outside body {outside sense, perception}. Posture, movement, and pain detect stimuli inside body {inside sense, perception}.
properties: sense interaction
Perception in one sense can affect perception in another sense. Smell and taste affect each other when eating food, in retronasal area. Taste and touch affect each other when eating food. Balance and sight affect each other for head and eye position and to find vertical. Touch and sight affect each other when handling nearby objects. Touch and hearing mechanical vibrations overlap near 20 Hertz.
properties: shortest perception
The shortest perception lasts 120 milliseconds to 130 milliseconds. Visual stimuli lasting less than 120 milliseconds make perceptions that last 120 milliseconds to 130 milliseconds.
properties: simultaneity
Events whose times differ by less than 100 milliseconds seem simultaneous to perception, but not to sensation or neurons.
properties: spatial relations
Spatial relations among object features do not change with changing viewpoint.
properties: subject
Perception is subjective and requires subject. Body-movement, sense-quality, and mental-state covariance defines subject and location, distinguishing it from environment, other organisms, and other minds. Subjective states have different being/reality than objective things [Schreiber, 1973].
properties: synchronicity
Events whose times differ by less than several milliseconds seem to be same event to perception.
properties: timing
Perceptual quality appears 20 milliseconds to 200 milliseconds after stimulus signal reaches brain. During interval between signal and perception, other stimuli can affect lateral inhibition, contrast enhancement, color finding, depth estimation, line orientation, texture analysis, feature detection, iconic memory, short-term memory, and long-term memory [Clifford et al., 2003].
properties: transformations
Object subtends different visual angles at different distances. It can retain same shape as it grows or shrinks in size. It can add or subtract parts or change spatial relations among parts. It can have different textures and lighting. It can have partial occlusion. It can fall on different retina locations. Viewing objects from different positions can change line orientations and angles.
properties: transient
Mind tends to perceive movement or change. However, high attentional load can cause change blindness, repetition blindness, attentional blink, and inattentional blindness.
properties: perception principles
Principle 1: Discontinuous motion between two nearby points indicates boundary.
Principle 2: Similar-size surface markings indicate object, especially if other-size markings surround surface.
Principle 3: Shallow objects have smooth boundary above deeper objects.
Principle 4: Objects are rigid, so curvatures stay constant.
Principle 5: Projection laws are true.
Principle 6: Oscillating and swinging are in planes.
Principle 7: Two surfaces intersect to make concave discontinuities.
Principle 8: Minimum-curvature points mark section boundaries.
purpose
Perception evolves to detect behaviorally useful information. Perception models physical reality, improving survival, adaptation, and reproduction.
Perception does not find true nature of physical world.
factors: awareness
People can be aware that they perceive stimulus. Perhaps, people have experiences when they think about perceptions [Burle and Bonnet, 1997] [Burle and Bonnet, 1999] [Colquhoun, 1971] [Dehaene, 1993] [Efron, 1970] [Fries et al., 2001] [Geissler et al., 1999] [Gho and Varela, 1988] [Harter, 1967] [Hirsch and Sherrick, 1961] [Kristofferson, 1967] [Lichtenstein, 1961] [Makeig et al., 2002] [Pöppel, 1978] [Pöppel and Logothetis, 1986] [Purves et al., 1996] [Quastler, 1956] [Rizzuto et al., 2003] [Rock, 1983] [Sanford, 1971] [Stroud, 1956] [VanRullen and Koch, 2003] [Varela et al., 2001] [Venables, 1960] [Wertheimer, 1912] [White, 1963] [White and Harter, 1969].
factors: consciousness
Conscious processes can modify perceptions.
factors: culture
Fundamental sense qualities can be innate, with no effect from culture, environment, or experience. For example, all cultures have the same basic colors, though languages can have rudimentary or sophisticated color vocabulary. Alternatively, different cultures and environments can cause different sense categories. People can learn colors and other sense qualities by perceiving environment and using language. For example, culture affects shape perception, geometric pattern orientation, and shape constancy. Differences in behavior and language indicate differences in perception.
Figures that cause illusions in USA have less effect in cultures in which rectangular objects and arrangements are rare. Horizontal-vertical illusions are stronger for observers living in savanna.
Cultures can describe salt as sour. Cultures can describe sweet, sour, and bitter as tasting like monosodium glutamate salt (MSG).
Ability to interpret relations among items in pictures differs with culture.
Child-rearing style and cultural social structure vary with field dependence.
factors: individuality
Receptor and brain differences, and different viewpoints, cause the same physical event to cause different perceptions in different people.
factors: learning
Learning can change later object or event perceptions by changing how perception extracts, values, and links perceptual features.
factors: memory
Memory can change how perception extracts, values, and links perceptual features.
processes: curvature and orientation
To find curvature and orientation at a surface point, measure angles or areas of six equilateral triangles forming a regular hexagon around the point.
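A sketch of the hexagonal sampling idea above, assuming a hypothetical smooth height function: fitting a quadratic patch to the center point and the six hexagon vertices yields the local slope (orientation) and a curvature estimate.

```python
# Sketch of estimating local surface orientation and curvature from samples at
# the center and six vertices of a regular hexagon, by fitting a quadratic
# patch z = a + b*x + c*y + d*x^2 + e*x*y + f*y^2. The height function is a
# hypothetical test surface, not perceptual data.
import numpy as np

def surface(x, y):
    return 0.2 * x + 0.1 * y + 0.5 * (x**2 + y**2)   # hypothetical surface

def local_fit(x0, y0, r=0.01):
    angles = np.arange(6) * np.pi / 3                 # hexagon vertex angles
    xs = np.concatenate(([x0], x0 + r * np.cos(angles)))
    ys = np.concatenate(([y0], y0 + r * np.sin(angles)))
    zs = surface(xs, ys)
    A = np.column_stack([np.ones_like(xs), xs - x0, ys - y0,
                         (xs - x0)**2, (xs - x0) * (ys - y0), (ys - y0)**2])
    a, b, c, d, e, f = np.linalg.lstsq(A, zs, rcond=None)[0]
    slope = np.hypot(b, c)                 # surface orientation (tilt magnitude)
    mean_curvature = d + f                 # (z_xx + z_yy) / 2, valid for small slopes
    return slope, mean_curvature

print(local_fit(0.0, 0.0))                 # slope ~0.22, curvature ~1.0
```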
processes: deconvolution
If a situation has many possible sources, convolution and deconvolution can reduce the number of source tests. To convolve sources, each array cell receives input from more than one source. Some sources cause effects, but most do not. Testing cells for the effect, a cell responds whenever an effective source is among its inputs, so comparing responses across cells shows which sources are effective. If a known pattern determined the convolution, the deconvolved pattern indicates the effective sources. If sources must interact to be effective, the pattern shows the effective interactions. Brain uses interacting sources to cause effects, so deconvolution can be a way that memory and action work. For example, ten feature sensors can feed into 100 cells, with each cell receiving from two sensors, so each sensor reaches many cells. If two sensors must interact to be effective, only the cell receiving both has a response over threshold, as in the sketch below. The array can have weights for sensors or for node connections. If weights can change, the array is like a neural net but with a starting structure.
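A loose sketch of the pairwise example above; the pairing scheme (unordered pairs of distinct sensors rather than the text's 100 ordered pairs) and the interacting pair are illustrative assumptions.

```python
# Loose sketch of the pairwise wiring described above: feature sensors feed an
# array of cells, one cell per sensor pair, and only the cell wired to both
# members of an interacting pair crosses threshold. Reading out which cell
# responded "deconvolves" the interaction. The effective pair is hypothetical.
from itertools import combinations

N_SENSORS = 10
cells = list(combinations(range(N_SENSORS), 2))       # one cell per sensor pair
effective_pair = (3, 7)                               # hypothetical interaction

def cell_response(pair, active):
    """Cell crosses threshold only when both of its input sensors are active."""
    return int(set(pair) <= set(active))

responses = {pair: cell_response(pair, effective_pair) for pair in cells}
detected = [pair for pair, r in responses.items() if r]
print("interacting sensor pair(s):", detected)        # -> [(3, 7)]
```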
processes: declarative knowledge
Perception involves statements about objects and events. Mind can process declarative knowledge both non-consciously and consciously.
processes: distance
To find surface distance, measure surface orientation and sight-line angle. The angle between sight line and surface becomes smaller as objects become more distant and is near perpendicular for nearer objects. At very great distances, brain cannot measure the orientation angle accurately. Brain also uses triangulation to find distance, as sketched below.
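A sketch of distance by triangulation, assuming two viewpoints a known baseline apart and made-up target coordinates; the law of sines recovers the range.

```python
# Sketch of distance from triangulation: two viewpoints a known baseline apart
# measure bearing angles to the same target, and the law of sines gives range.
# The baseline and target position are made-up numbers.
import math

def triangulate(baseline, alpha, beta):
    """Distance from the first viewpoint to the target.

    alpha: angle (radians) at viewpoint 1 between the baseline and its sight line.
    beta:  angle (radians) at viewpoint 2 between the baseline (back toward
           viewpoint 1) and its sight line.
    """
    return baseline * math.sin(beta) / math.sin(alpha + beta)

# Target placed at (3, 4); viewpoints at (0, 0) and (1, 0).
alpha = math.atan2(4, 3)                      # bearing from the first viewpoint
beta = math.pi - math.atan2(4, 3 - 1)         # bearing from the second viewpoint
print(triangulate(1.0, alpha, beta))          # ~5.0, the true distance sqrt(3^2 + 4^2)
```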
processes: equilibrium
Perhaps, input disturbs equilibrium, and sense qualities restore equilibrium. New stimuli cause imbalance, then flows associated with sense qualities restore balance.
processes: magnitude
Mind can compare two stimulus intensities by ratio {magnitude, perception}, rather than difference. For small intensity range, next higher magnitude n+1 can be double or triple preceding magnitude n. For large intensity range, next higher magnitude can be ten times more than preceding magnitude. Magnitude judgments require minimum-stimulus zero level but no intensity measurement unit.
processes: motion as cause
Perceptions grow out of objects in motion. At extremes of pain and low frequency, mechanical movements, involving only mass, charge, space, and time, blend with and are identical to sense qualities, in all senses.
Mind can detect mechanical vibration up to twenty cycles per second, the same as lowest detectable sound frequency. Sound detects rapid mechanical vibrations. Mind can feel sound, as well as hear it, at low frequency. Vision blurs succession of frames at twenty cycles per second into continuous motion. Touch and temperature border each other at twenty cycles per second. Below twenty cycles per second, senses perceive mechanical motion, which has and needs no sense qualities. People cannot breathe, flick tongue, or do anything at rates greater than twenty cycles per second. Twenty cycles per second is limiting rate for body mechanical motions.
All sense qualities use sense organ motions. Smell and taste use matter in motion. Food or air texture is always part of taste and smell, as is pressure. Fingers move. Eyes move. Ears cock, or heads turn.
At high intensity, pain is similar for all senses, and all senses are alike at high intensity. High firing rate overcomes all correlations that distinguish senses, and sense qualities become only pain. High intensity can feel like pressure [Smith and Smith, 1962].
processes: movement
Perception laws depend on movement patterns [Smith and Smith, 1962].
processes: prediction
Organisms detect stimulus associations, patterns, laws, and regularities and use them to predict or track events, by analogy or generalization. Mind uses most stimuli to build predictive abilities, rather than to initiate response. Animals habituate to stimuli that have no predictive value [Dodwell, 1970].
processes: object categories
Organisms need to recognize food/prey, dangerous situation/predator, and related organism: mate, child, relative, and self. They can recognize different levels, such as food that is easier to get or more nutritious.
processes: optimization
Perception is like relaxation or optimization, which finds the most-likely pattern.
processes: response to stimulus
All stimulus responses are complex. Response motor output causes internal feedback stimuli by stimulating kinesthetic receptors and external feedback stimuli by changing environment. All responses involve anticipation.
processes: space
Psychological spatial concepts derive from object location, size, and orientation perceptions. Special visual system encodes spatial properties.
Separate visual system encodes object shapes, colors, and textures.
processes: stimulus
Stimuli are structured energy patterns that reach body sensors. All stimuli affect multiple sensors and are complex.
processes: surface perception
Surfaces have extents, locations, orientations, durations, depths, and other perceptual properties. Greatest perceptual-property change rate is at surface boundaries. Surface has temporal and spatial scale, which mind can expand and contract. Mind uses surfaces and surface boundaries to perceive patterns, objects, scenes, and events. Objects or patterns are surface sets. Object sides are convex, concave, or flat surfaces, which have surface textures, such as number of points or bumps.
Perceptual experience corresponds to unique object or event {indestructible simple, perception}, which has name.
Perception can use temporal steps {microgenesis} [Bachmann, 1994] [Bachmann, 2000].
Physical stimuli evoke measurable perceptions, including subjective sense qualities, and psychological changes {psychophysics}.
Propositions about facts require words for classes {referent, perception}, which refer to multiple objects.
Specific receptors detect different skin sense qualities {specificity theory}, such as temperature, vibration, and deformation.
The same mental function can always use same physiology {state-identity theory, perception} {type-identity theory, perception} [Churchland, 1979] [Churchland, 1986] [Churchland, 1988] [Churchland, 1995] [Churchland, 2002] [Farber and Churchland, 1995] [Pylyshyn, 1984].
Mind uses sense information and learned, rapid, ad hoc rules to perform appropriate action {utilitarian theory of perception} [Ramachandran and Blakeslee, 1998].
Perception detects distance, angle, size, shape, speed, brightness, hue, lightness, loudness, pitch, attack, decay, pressure, temperature, texture, taste, and smell {perceptual feature}. Perhaps, sense has 100,000 independent features. Features are frequent and regular, so people soon memorize them in all possible states and combinations.
processes
Mind derives features from local spatial and temporal relations among intensities in sense information channels. Features are about stimulus intensity, location, time, frequency, and quality and about higher-level stimulus combinations.
processes: association
Mind associates two features if they occur simultaneously at a rate higher than chance, as in the sketch below.
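A sketch of the above-chance criterion, using made-up binary feature streams: the joint occurrence rate is compared with the product of the individual rates.

```python
# Sketch of "associate two features if they co-occur more often than chance":
# compare the joint occurrence rate with the product of the individual rates.
# The feature streams are made-up binary sequences.
import random

random.seed(0)
n = 10_000
a = [random.random() < 0.3 for _ in range(n)]                  # feature A present?
b = [ai if random.random() < 0.5 else random.random() < 0.3    # B partly copies A
     for ai in a]

p_a = sum(a) / n
p_b = sum(b) / n
p_ab = sum(x and y for x, y in zip(a, b)) / n

# Features are associated when the joint rate exceeds the chance rate.
print(f"joint {p_ab:.3f} vs chance {p_a * p_b:.3f} ->",
      "associate" if p_ab > p_a * p_b else "independent")
```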
processes: feature analysis
Perception distinguishes and links features, values, and probabilities. Perception excitations and inhibitions depend on reinforcement pathways and change feature probabilities. Feature analysis {feature analysis} works for independent variables with discrete values but not for clustered variables or continuous values.
properties: continuity
Feature values are always continuous, with no discreteness, edges, jumps, or skips, though neuron signals are discrete. Feature values are continuous even for neurons far apart and for small intensities. Movement, blinking, and other transformations never cause feature value to be discrete.
Perhaps, continuity results from insensitivity to change. Perhaps, coordinate units are larger than feature sizes. Perhaps, information channel and signal number are large, so graininess is small. Perhaps, continuity results from integration, over multiple information channels, of overlapping regions of different sizes, displacements, and orientations.
properties: discontinuity
Missing features cause discontinuities in events.
properties: probability
Features have probabilities of happening if another feature happens.
effects
Recognized features and feature combinations cause actions. Unrecognized features suppress actions [Werner, 1974].
Perceptual features {accidental feature}| or regularities, such as aligned edges and reflected colors, can result from viewing position {accidental viewpoint}.
Perceptual features {non-accidental feature}| {non-accidental property}, such as mass, do not depend on observation point. Non-accidental features stay constant from multiple viewpoints and under transformation, reflection, rotation, translation, and zooming. Relative feature positions stay the same.
recognition
Memory uses non-accidental properties and relative positions to make object templates for perceptual recognition.
projection
Straight edge tends to project collinear lines. Curved edge tends to project lines that fall along smooth function. Parallel edges tend to project parallel lines. Edge intersections tend to project lines that meet at point. Parts that are close together tend to project lines that are close together. Symmetrical parts tend to project symmetrical line patterns.
Two stimuli can mutually inhibit {masking, perception}| [Bachmann, 1994] [Bachmann, 2000] [Breitmeyer, 1984] [Breitmeyer and Ögmen, 2000] [Dehaene et al., 2001] [Dennett, 1991] [Enns and DiLollo, 2000] [Flanagan, 1991] [Flanagan, 1992] [Flanagan, 2002] [Keysers and Perrett, 2002] [Keysers et al., 2001] [Macknik and Livingstone, 1998] [Macknik et al., 2000] [Rolls and Tovee, 1994] [Thompson and Schall, 1999] [Thompson and Schall, 2000] [VanRullen and Koch, 2003].
If, after a stimulus lasting several milliseconds, a second stimulus appears at the same location, people do not perceive the first stimulus {backward masking}. Masking is greatest when second stimulus is 70 milliseconds to 90 milliseconds after first. Second stimulus has no effect after 100 milliseconds. For sound stimulus, second stimulus is noise. If first stimulus causes emotion, emotion results even if stimulus is not conscious.
Masks can precede targets {forward masking}.
Masks can be simultaneous with targets, or masking stimulus can follow short stimulus {metacontrast masking}, to improve detection.
People skew their reports about perceptions {perceptual bias}, for example, when experimenters ask people to report intensity using number scale.
People's responses can have symmetric distribution around central value {centering bias}. Avoid this bias by using response-scale numbers equally.
People's responses can use shortcuts based on associations {contraction bias}. Avoid this bias by reversing stimuli and responses.
People can respond to high and low stimuli differently {logarithmic response bias}. Avoid this bias by using number range with only single digits, with no ratios.
Smaller stimulus ranges have steeper slopes {range equalizing bias}. This bias is unavoidable, so people underestimate large sizes and differences and overestimate small sizes and differences.
People's responses can group stimuli by time {stimulus frequency bias}. Avoid this bias by presenting all stimuli equally often.
People's responses can group stimuli in space {stimulus spacing bias}. Avoid this bias by spacing stimuli at subjectively equal intervals.
Previous conditions can influence later performance {transfer bias}. Avoid this bias by using separate groups of uninitiated people for each investigation or judgment.
Stimulus intensity, location, size, form, number, and duration {amodal feature} do not depend on sense, can transfer among senses, and allow equivalence judgments among sense modes {amodal perception}|. Perhaps, they allow desire, expectation, or pain judgments.
At space locations, mind can average sense qualities with neighbors {blur}|. Sampling error, background noise, and sense organ imperfections, such as imperfect lenses, retina veins, and dust, cause blur.
People perceive continuously varying intensity or frequency as discrete ranges {categorical perception}. Sense processing divides continuous range into intervals and so discrete categories [Damper and Harnad, 2000] [Harnad, 1987]. For example, people perceive pitch as tones and half tones. People perceive tone durations as eighth notes, quarter notes, half notes, and whole notes.
Categorical perception detects musical intervals, animals, faces, and face expressions. People identify and label perceptual features and feature combinations with sharp boundaries, using many dimensions. Labeling/identifying and discrimination are two aspects of one mechanism.
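A sketch of categorical pitch perception, assuming a standard equal-tempered scale referenced to 440 Hz; the test frequencies are arbitrary.

```python
# Sketch of categorical perception of pitch: a continuously varying frequency
# is perceived as the nearest semitone category. The 440 Hz reference and
# equal-tempered scale are standard; the test frequencies are arbitrary.
import math

NOTE_NAMES = ["A", "A#", "B", "C", "C#", "D", "D#", "E", "F", "F#", "G", "G#"]

def pitch_category(freq_hz, ref_hz=440.0):
    """Map a frequency onto the nearest equal-tempered semitone."""
    semitones = round(12 * math.log2(freq_hz / ref_hz))
    return NOTE_NAMES[semitones % 12]

for f in (436.0, 440.0, 452.0, 466.0):
    print(f"{f:6.1f} Hz -> {pitch_category(f)}")
```

Frequencies within about a quarter tone of 440 Hz all map to the same category (A), while 466 Hz crosses the boundary into the next category (A#).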
Smell perception diminishes during continuous exposure {cognitive habituation}, though receptor and neuron sensitivity do not change.
People can correlate intensities in two different senses {cross-modality matching}. For example, mind can match taste intensity with pain intensity. The cross-modality matching technique can test sensitivity to stimuli.
Stimuli {cue, perception}| before other stimuli can indicate second-stimulus types, times, or locations. Cues have maximum effectiveness 150 ms before second stimulus. Cues can provide correct information {valid cue}, incorrect information {invalid cue}, or no useful information {neutral cue}.
People can have feeling that they have seen or heard something before {déjà vu}|. Déjà vu typically occurs when people are fatigued, have experienced component features before, are young, and have heightened sensitivity.
Different contexts can make the same object or event signal differ, because context expands or contracts at varying rates {domain warping}. Transforming space and structure representations can find geometric analogies and trajectories.
People can have false recognition {fausse reconnaissance}.
Mind places new features at locations that have easily described relationships to already represented features {frontier effect}.
Objects have properties, and features have properties. In most perception, object properties override feature properties {global superiority effect}.
Novel unrelated stimuli can aid pattern, poetry, idea, and hypothesis recognition {lateral thinking}. Random words are examples.
Experience thresholds {limen} are not constant.
Observer can assign or choose natural number to estimate stimulus intensity magnitude {magnitude estimation, observer}, such as 1 to 10 for minimum to maximum.
Mind can perceive two different stimuli {metamer} as the same. For example, two different wavelength and intensity combinations can result in same color. Objects with different surface reflectances can cause same color perceptions.
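A sketch of how two physically different spectra can be metamers, assuming hypothetical Gaussian cone sensitivities (not measured human curves): adding a component from the null space of the sensitivity matrix (a "metameric black") changes the spectrum but not the receptor responses.

```python
# Sketch of metamers: two different spectral power distributions that produce
# identical receptor responses. Cone sensitivities here are hypothetical
# Gaussian curves, not measured human data, and physical non-negativity of the
# second spectrum is ignored for simplicity.
import numpy as np

wavelengths = np.linspace(400, 700, 31)                     # nm

def gaussian(center, width=40.0):
    return np.exp(-((wavelengths - center) / width) ** 2)

cones = np.stack([gaussian(c) for c in (440, 540, 570)])    # 3 x 31 sensitivities
spectrum1 = gaussian(500, width=80.0)                       # arbitrary smooth spectrum

# Any vector in the null space of the cone matrix is invisible to the cones.
_, _, vt = np.linalg.svd(cones)
metameric_black = vt[-1]                                    # null-space direction
spectrum2 = spectrum1 + 0.3 * metameric_black

print("spectra differ:", not np.allclose(spectrum1, spectrum2))
print("cone responses equal:", np.allclose(cones @ spectrum1, cones @ spectrum2))
```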
Perception can distinguish event order {order discrimination}, though events seem to be simultaneous in sensations.
Perhaps, perception occurs in discrete steps every 20 to 200 milliseconds {perceptual moment} {frame, perception} {snapshot, perception}. Perception is then not continuous [Burle and Bonnet, 1997] [Burle and Bonnet, 1999] [Colquhoun, 1971] [Dehaene, 1993] [Efron, 1970] [Fries et al., 2001] [Geissler et al., 1999] [Gho and Varela, 1988] [Harter, 1967] [Hirsch and Sherrick, 1961] [Kristofferson, 1967] [Lichtenstein, 1961] [Makeig et al., 2002] [Pöppel, 1978] [Pöppel and Logothetis, 1986] [Purves et al., 1996] [Quastler, 1956] [Rizzuto et al., 2003] [Rock, 1983] [Sanford, 1971] [Stroud, 1956] [VanRullen and Koch, 2003] [Varela et al., 2001] [Venables, 1960] [Wertheimer, 1912] [White, 1963] [White and Harter, 1969].
Perception can alternate between two interpretations {rivalry, perception}|, though stimulus pattern stays the same.
Stimulus can have higher intensity than neighboring stimuli {saliency, perception}|. Saliency originates in dorsomedial pulvinar, lateral-intraparietal lobe, and frontal lobe. Saliency can affect thalamus or sensory cortex [Blaser et al., 1999] [Braun and Julesz, 1998] [Braun and Sagi, 1990] [Braun et al., 2001] [Itti et al., 1998] [Itti and Koch, 2000] [Itti and Koch, 2001] [Jovicich et al., 2001] [Koch and Ullman, 1985] [Nakayama and Mackeben, 1989] [Parasuraman, 1998] [Pashler, 1998] [Treisman, 1988] [Treisman and Gelade, 1980] [Walther et al., 2002] [Wolfe, 1994] [Wolfe, 1999].
Animals can become ready for specific stimuli or general stimulation {sensitization, perception}. In harsh environments, animals sensitize to all stimuli. In favorable environments, animals become responsive to stimuli about desirable goals. Rearing animals in restricted environments, in which they see only vertical stripes, results in cortical neurons more sensitive to vertical orientations.
External stimuli of which people are unaware, just below the conscious threshold {subliminal perception}|, can affect perception and memory. All sensory modes have a subliminal range [Dixon, 1971] [Merikle and Daneman, 1998]. Subliminal stimulus in "blind" eye transfers perception to that side.
Sense-organ stimuli can cause different sense qualities {synesthesia, perception}|.
colors
A white numeral symbol can have color. However, a Roman numeral can have no color while the Arabic numeral for the same number has color, indicating that synesthesia is for perception, not for the abstract concept.
Perhaps, connections between brain area V4 and an area in V8 {number-grapheme area}, which are adjacent in fusiform gyrus, cause color-numeral synesthesia.
Months and weekdays can have colors. Perhaps, connections between TPO and angular gyrus areas cause day-month and numeral synesthesia.
tastes
Shapes can have associated tastes. Insula is for tastes and is close to sensory hand area.
sounds
Shapes can have associated sounds. Low sounds can associate with dark colors, and high sounds with light colors [Cytowic, 1989] [Cytowic, 1993] [Cytowic, 2002] [Galton, 1997] [Grossenbacher and Lovelace, 2001] [Nunn et al., 2002] [Paulesu et al., 1995] [Ramachandran and Hubbard, 2001] [Ramachandran and Hubbard, 2003] [Ramachandran, 2004] [Stein and Meredith, 1993] [Stein et al., 2001].
factors: age
Infants have cortex and thalamus auditory-visual brain connections.
factors: brain
Brain regions can activate each other through atypical stimulation pathways or through inhibition pathway loss.
factors: drugs
Drugs can cause stimuli in sense organs to result in another sense's sense qualities.
properties: occurrence
Less than 0.5% of people have synesthesia, usually as just black and white colors. Such synesthesia is involuntary, starts early in life, and lasts a lifetime. It is hereditary. It occurs more with left-handedness, more in females, more in people with better memory, more in people with poor math and spatial ability, and more in creative people [Baron-Cohen and Harrison, 1997].
properties: unconscious
Synesthesia is not under conscious control.
Mind stores sensory signals and features temporarily {temporal buffering} while receiving signal remainder. Natural stimuli are not stationary, do not move regularly, and require time interval.
If vision analyzes spatial relations differently than hearing or kinesthesia, vision overrides hearing or kinesthesia {visual dominance}. For example, sound direction depends on direction of visual object associated with sound [Ingle et al., 1982]. If kinesthesia or touch analyzes spatial relations differently than hearing, kinesthesia or touch overrides hearing. Taste and smell have little effect on spatial relations.
Sense qualities have physical measurements {primary property} about space and time, such as object size, shape, motion, number, solidity, hardness, mass, and extension.
Sense qualities have mental measurements {secondary property}, such as color, touch, aroma, taste, timbre, and sound. Secondary qualities do not derive from primary qualities but come from stored knowledge and assumptions. They affect even simplest perceptions.
Stimuli are energy patterns representing information about objects in environment or body {representation, perception} {internal model}. Mental representation organizes perception and guides behavior. Objects have more than one representation and template, using different viewpoints and/or viewer-centered or absolute coordinates. Mind represents objects by shape, size, orientation, and feature and part relations. Mind stores features and relations in flexible templates.
processes
Parallel and serial information flows convey data for object and event variables. Sense receptors transform energy and code useful information parts. Nervous systems process and store object-and-event information. Information directs muscle and gland actions.
processes: object properties
Mind represents object by shape, size, orientation, and feature and part relations. Visual system encodes object properties of shape, color, and texture. A separate and independent visual system encodes spatial properties of location, size, and orientation.
processes: arrays
Representation can be two-dimensional array coding variable intensities. For example, colorful scenes can be like television screens, with intensity levels for red, green, and blue phosphors at screen points. However, representations cannot be point-for-point copies of visual images, because sense receptors communicate laterally, so points include data from surrounding points.
Representation can be two-dimensional array coding interference pattern intensities. For example, black-and-white scenes can be like holograms, with intensity levels at points determined by illumination phase from all scene points. However, representations are unlikely to be holograms, because mind does not use phase information at sense receptors and mind has no reference beam with which to reconstitute holographic images.
Representations can be sets of two-dimensional arrays, each coding one variable. For example, parallel information pathways can code for red, green, and blue intensities and combine the three later to give net color. Representation uses different topographic maps to code for location, orientation, depth, color, shape, motion, and time. After initial processing for variables, outputs cross-correlate to integrate information.
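A minimal sketch of the one-array-per-variable idea above, using three hypothetical 4x4 intensity maps for red, green, and blue that are combined later into one color representation.

```python
# Sketch of "sets of two-dimensional arrays, each coding one variable": three
# separate intensity maps for red, green, and blue, combined later into one
# color representation. The 4x4 scene values are arbitrary random numbers.
import numpy as np

height, width = 4, 4
red = np.random.rand(height, width)       # one 2-D array per variable
green = np.random.rand(height, width)
blue = np.random.rand(height, width)

color_image = np.stack([red, green, blue], axis=-1)   # channels combined later
print(color_image.shape)                              # (4, 4, 3)
```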
Representation can be information packets routed through mental networks. Like Internet, mind can divide information into data blocks sent over alternative pathways to destinations, where mind reunites them. Representations probably use information packets, tagged with relations to other packets and timed to synchronize with other packets.
processes: hierarchy
Pre-representations code for intensities and have no meaning. Representations combine innate and remembered information with sense information. Further processing makes semantic object and event relations, and gist allows thoughts and goals. Mind has representation-type hierarchies [Booth and Rolls, 1998] [Posner and Raichle, 1997] [Posner, 1978] [Posner, 1989].
processes: proposition
Mind uses propositions to represent images and describe shapes, without using size or orientation. Mind manipulates images by logical operations on propositions. Propositions can have variables.
coding
Space-time pattern representations use neuron signal-intensity functions. Visual system samples scene at various spatial positions, in sequences based on experience, to derive curvatures, surfaces, textures, reflectances, colors, orientations, eye positions, head position, hand positions, constancies, and co-variances.
coding: analog and digital
Depolarization impulse cycles require one millisecond, so each millisecond axon has depolarization or not. Axons carry OFF/ON signals and so are digital.
At synapses, variations in neurotransmitter vesicle size and release time make analog flow through receptors.
Neural processing has advantages and avoids disadvantages of analog and digital coding. OFF-signals set a steady baseline. ON-signals have equal strengths. Neuron coding has no timing, is not stepwise, and is not linear. Axon coding depends on impulse rate or flow.
coding: codes
Simple code can use neuron average firing frequency. Another simple code can modulate firing frequency, as in FM radio, in which fundamental frequency is like carrier wave. In temporal code, steady frequency is like clock, and frequency changes carry information. Code can superimpose frequencies to make beat frequencies. Code can superimpose frequencies, so axons carry composite signals, and different receptors use different components. Code can be waveforms of frequency sets.
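A sketch of one of the codes named above, a modulated firing frequency, with an assumed carrier rate and gain: the signal modulates the rate, and a reader recovers it by counting spikes per time window.

```python
# Sketch of a frequency-modulated spike code: a carrier firing rate is
# modulated by the signal, and a downstream reader recovers the signal by
# counting spikes per time window. Carrier rate, gain, window length, and the
# signal itself are illustrative choices.
import math

def encode(signal, carrier_hz=100.0, gain_hz=50.0, window_s=0.1):
    """Return spike counts per window for a list of signal samples in [0, 1]."""
    return [round((carrier_hz + gain_hz * s) * window_s) for s in signal]

def decode(counts, carrier_hz=100.0, gain_hz=50.0, window_s=0.1):
    return [(c / window_s - carrier_hz) / gain_hz for c in counts]

signal = [0.5 * (1 + math.sin(2 * math.pi * t / 10)) for t in range(10)]
counts = encode(signal)
recovered = decode(counts)
print([round(s, 2) for s in signal])      # original samples
print([round(r, 2) for r in recovered])   # recovered, quantized by spike counting
```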
spatial coordinates
Minds use three-dimensional spatial coordinates to navigate, to encode spatial information into memory, to transform images, and to specify feature locations, sizes, and orientations. Mind represents image by specifying intensities at locations in space array. Space array can show object parts, relations, and spatial axes. Representation spatial and temporal relations correspond to actual relations. Mind can manipulate size and orientation.
Minds convert categorical relations to spatial coordinates, and vice versa, to link size, distance, orientation, front/back, and reference frame to classification.
spatial coordinates: types
Mind can use locations relative to retina {retinotopic coordinate} or relative to spatial reference point {spatiotopic coordinate}. Spatiotopic coordinates can be relative to body {body-centered coordinate} {egocentric coordinate} or to another object {allocentric coordinate}. Body-centered coordinates can relate to head {craniotopic}. Mind plans and performs behavior using egocentric coordinates, compensating for body movements. Body movement coordination requires only egocentric space, not images. Egocentric space can transform to conceptual space representations.
Allocentric representations can transform to egocentric representations. Allocentric coordinates can be specific to view {viewer-centered} or object itself {object-centered}.
Mind can specify location in Cartesian coordinates, along X, Y, and Z dimensions from origin, or polar coordinates, by radius and planar and depth angles from origin.
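A sketch of the two location codes named above, converting between Cartesian coordinates (x, y, z) and polar coordinates (radius, planar angle, depth angle).

```python
# Sketch of Cartesian versus polar location codes: (x, y, z) converts to
# (radius, planar angle, depth angle) and back. The test point is arbitrary.
import math

def to_polar(x, y, z):
    r = math.sqrt(x * x + y * y + z * z)
    azimuth = math.atan2(y, x)                  # planar angle
    elevation = math.asin(z / r) if r else 0.0  # depth angle
    return r, azimuth, elevation

def to_cartesian(r, azimuth, elevation):
    return (r * math.cos(elevation) * math.cos(azimuth),
            r * math.cos(elevation) * math.sin(azimuth),
            r * math.sin(elevation))

p = (3.0, 4.0, 12.0)
print(to_polar(*p))                 # radius 13.0 plus two angles
print(to_cartesian(*to_polar(*p)))  # recovers (3.0, 4.0, 12.0)
```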
Processes that guide action need coordinates {implicit coordinates}. Processes that store representations need coordinates {explicit coordinates}.
Local coordinates specify part locations, using many separate origins to form interlocking coordinate system. Global coordinates specify part locations relative to one origin.
Topographic maps compute locations in nearby space using body-based coordinates. Topographic maps compute locations in far space using allocentric coordinates.
Not all directions relative to body are equally accessible in image.
Subjects do not image themselves in centers of three-dimensional scenes.
People use viewer-centered coordinates in imagery.
image
Image is private symbol system that specifies local object geometry using categories and coordinates. Mind makes general, specific, and autobiographical images. Unlike perceptions, image has interpreted perceptual units, orients in space, aids event recall, and solves problems. Mind can remember images. Mind cannot readily manipulate image. Image is not picture in the head.
image: scanning
Mind can scan image, and scanning time increases linearly with distance.
image: information
Image information depends on element number and arrangement. Image does not have as much detail as physical object.
image: cue
Eye position can cue access to next image in sequence.
image: spatial frequency
Images do not include perception fundamental spatial frequency.
image: brain
In brain, membrane electrochemical signals can alter molecules that eventually produce enzyme sequences, forming patterns. Enzyme patterns can affect nearby cells or affect transferred nerve signals, making effect cascades. Both hemispheres can generate images.
Brain-damaged patients who cannot recognize faces report that they also cannot image faces. Patients who cannot distinguish colors after brain damage also cannot form mental images that include color. Patients with hemi-neglect lose the right or left half of mental images and dreams.
image: time
Mind requires 100 milliseconds to 200 milliseconds of light to see image. Processing image takes 60 milliseconds to 70 milliseconds [Shiekh, 1983].
image: network
Images have parts and relations. Images are networks of nodes on surfaces and connections of spatial relations. If nodes and connections are abstract-space dimensions, patterns are abstract-space points. Similar images are near each other in state space.
Mind can store center/surround information at several, separated locations {primal image representation}, rather than high-level feature sets.
Internal interaction, past reaction, experience, goal, framework, available information, and action representations {schema, perception}| can direct perceptual exploration {anticipatory schema}. Mind updates integrated prior-movement representations {postural schema} after position changes. Schemas are rule groups. Rules generalize inference patterns. Schemas are for concept formation [Schank and Abelson, 1977] [Schank, 1997].
Animals navigate environment using map with reference point {centroid} and gradient {slope, gradient} {slope-centroid model}. Mind can calculate direction and distance to target by triangulation using centroid and slope.
Observer can change stimulus intensity or frequency until it matches, or differs from, reference stimulus, or until observer does or does not perceive it {method of adjustment} {adjustment method}. It is a variant of the method of limits in which the observer controls the stimulus.
Observer experiences many stimuli covering whole stimulus range and reports perception or no perception {method of constant stimuli} {constant stimuli method}. Alternatively, observer compares many stimuli covering whole stimulus range to reference stimulus and reports if they differ from reference or not.
Observer can attend to sound in one ear while distracted by sound in other ear {dichotic listening technique}. Alternatively, observer can attend to voice while another voice speaks.
After seeing stimulus {target, perception}, observer can respond if he or she sees stimulus among distracting stimuli {n-alternative forced-choice} {forced-choice}.
Starting from no difference in frequency and/or intensity and increasing difference, observer can note when there is difference {method of limits} {limits method}. Alternatively, starting from big difference and decreasing difference, observer can note when there is no difference.
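A sketch of an ascending method of limits with a simulated observer; the true threshold and step size are made-up values.

```python
# Sketch of an ascending method of limits: intensity steps up from zero until
# a simulated observer first reports a difference; the first "yes" estimates
# the threshold. The observer's true threshold and the step size are made-up.
def ascending_limits(true_threshold=0.37, step=0.05, max_intensity=1.0):
    intensity = 0.0
    while intensity <= max_intensity:
        perceived = intensity >= true_threshold      # simulated observer report
        if perceived:
            return intensity                         # first "yes" = threshold estimate
        intensity += step
    return None

print(f"estimated threshold: {ascending_limits():.2f}")   # ~0.40
```

A descending series would start from a clearly perceptible difference and decrease it until the observer first reports no difference; averaging ascending and descending crossovers reduces bias.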
Observers can respond after detecting stimulus on right or left {position discrimination}.
Observers can attend to text lines in one color and ignore lines in another color {selective reading paradigm}.
Observer can respond after detecting stimulus {stimulus detection}.
Observer can respond if stimulus is word or not {word discrimination}.