These neurosciences thus produced a flexible barrier between the realms of stimulus, the form of the data, the organs of reception, and the site of processing. While such subjective perception had been identified in nineteenth-century physiology and psychology, it was now no longer a problem for scientific objectivity and knowledge; it was positively embraced for its technological potential in neural nets. The very nerves, extracted from any particular body, are capable of processing and analyzing data.
I would even argue that ontology and epistemology were both collapsed into another approach, which focused on method, process, and feedback. The act of processing information and the act of analyzing it became the same, and the possibility emerged that this de-contextualized seeing process could be rebuilt in other locations. This is not an insignificant experiment in the histories of visuality.
The cybernetic model of perception desired a purely technical and autonomous eye. If one wished to see an insect, then one built a frog’s eye; if one wished to see a missile silo, perhaps one built a different form. Vision circulated. There was no single norm for vision. The ideal of a singular, or objective, form of vision was replaced by a fantasy of effectiveness serving particular functions. Historically, I wish to focus on the critical role that this lack of concern for static ontologies played in facilitating a shift in the conception of sense perception as an interactive process and a material technology in design, cognitive science, and cybernetics. This was an eye extended into the body and out into the world, a vision that was material and could now act on its own—flies eaten and airplanes blown up, for example—a networked cognition beyond the brain and a new way to understand the differences between subjects and objects. There was no ontological stability in cybernetic visuality; there were no stable enemies or prey.
There was, however, a curious indexical and temporal nature to this ability to materialize vision. To focus on how eyes “speak” to the brain demanded a lack of regard for, or perhaps an automation of, recording and an assumption of an informatically dense world. The impossibility of ever accessing and processing all this data was no longer the problem. Instead, the question became how to manage and utilize the unknown. This subtle but important revision of attitudes to knowledge and objectivity was first articulated in McCulloch’s classic piece, written with Pitts in 1943, establishing the equivalence between neurons and Turing machines and conceiving of a “neural net.” As I explained in chapter 3, McCulloch ends the piece with an astonishing statement concerning scientific claims: “thus [this research proves] that our knowledge of the world, including ourselves, is incomplete as to space and indefinite as to time. This ignorance, implicit in all our brains, is the counterpart of the abstraction which renders our knowledge useful.”
This “ignorance,” or subjective quality of all cognition, was now the “abstraction” that produced “use.” Subjective perception was equated with technological potential without concern for mediation, and efficacy replaced the concept of an absolute reality as the measure of truth. McCulloch not only took a non-Cartesian perspective but also resolutely declared any split between the mind and the body, or reality and cognition, both undesirable and impossible.
Notes to Chapter 4: Governing
16. A great deal of work was done at the time on scanning, flickering, and other significant visual phenomena. For example, the work of Frank Rosenblatt, under the influence of McCulloch, on machine vision, cognition, and scanning shaped the conception and design of perception and cognition in electronics and machine intelligence, and contributed to the development of scanning technologies. Rosenblatt’s “Perceptron” (1958) anticipates Lettvin et al., “What the Frog’s Eye Tells the Frog’s Brain” (1959), and both articles anticipated future work in contemporary machine learning and vision models, where symbolic processing and neural nets have now returned to use, even though ideas of perceptrons were later debunked in 1969 by Marvin Minsky and Seymour Papert in a book titled Perceptrons, in an ongoing set of debates over the nature of, and approach to, machine intelligence.
17. McCulloch, Embodiments of Mind, 34.
18. McCulloch elaborated on these concepts in many of his lectures, including those delivered before the experiment itself was conducted; see “Physiology of Thinking and Perception.”