There are several topics and problems that span multiple sciences, and some that overlap into philosophy. One of these is The Identity Problem. In short, the problem has to do with understanding what makes a thing identifiable as itself. For example, identical twins have many similarities, but in most cases they at least have different names, and the name is one of the major things used to distinguish one from the other (perhaps in addition to some unique physical attribute).
In mathematics, identity is tied to the question of whether math is real (a discovery) or an invention. One argument is that math is just an abstraction, and therefore isn’t real. Here’s a thought experiment:
Imagine a penny. It might be new and shiny, or old and dull - it doesn’t really matter. You know what a penny is, and this is definitely a penny. Mathematically, you might say you have one penny. All good, so far.
Now, imagine there is another penny right next to that one. Let’s say the first penny is dated 2004 and is very shiny, and the second one is dated 1998 and is a bit dull. What do you have now? One answer is that you have two pennies. Obviously, one penny plus one penny equals two pennies. Or maybe, you have one 2004 penny, and one 1998 penny.
Let’s replace the pennies with something else. Let’s imagine you have an apple. So, you have one apple. Now imagine you also have an orange. What do you have now? You might say you have one apple and one orange. Or, you might say you have two fruits. Because one fruit plus one fruit equals two fruits.
So, what is happening here? In order to get two fruits, you have to use an abstraction, e.g. fruit instead of apple or orange. But by using an abstraction, you are hiding the identity of the things you have. Just like with the pennies, or the twins. To count them, you must create an abstract definition, and count the items which match that abstract definition.
Counting is a fundamental element of mathematics. Without counting, you can only ever have the numbers zero and one. You either have a thing, or you don’t. In order to have two or more things, you must abstract. And by abstracting, you ignore or hide identities.
So, what is an identity, then? It is the description of a thing such that you can only ever count one of them. Your name might be your identity. But what if your name is John Smith? Then you need additional details in the description to distinguish you from the other John Smiths. Counting three guys named John Smith requires an abstraction by name. But John Smith, born June 1, 1982, in Pittsburgh is distinct from John Smith, born November 15, 1993, in Albuquerque.
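The distinction can be made concrete with a small Python sketch (the names and dates are the hypothetical ones from above): counting under an abstraction collapses distinct individuals, while counting by a fuller identity keeps them apart.

```python
# Hypothetical people, matching the John Smith example above.
people = [
    ("John Smith", "1982-06-01", "Pittsburgh"),
    ("John Smith", "1993-11-15", "Albuquerque"),
]

# Abstraction: identify each person by name alone.
names = {name for name, born, city in people}
print(len(names))  # 1 -- the two individuals collapse into one

# Identity: include enough detail that each description fits only one person.
identities = set(people)
print(len(identities))  # 2 -- each tuple is unique
```

The abstraction hides exactly the information (birth date, birthplace) that made each person unique, which is what lets the count exceed one.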
What is the identity problem, then? It is the observation that all mathematics rests on abstraction, because all things can, with sufficient effort, be uniquely identified. Even atoms or fundamental particles can be identified by their properties, which may include charge, spin, velocity, and location.
Although many aspects of reality can be explained and understood with mathematics, the explanation will essentially always be incomplete unless it is understood from the perspective of uniqueness and identity.
Let’s go back to our pennies now. If mathematics exists intrinsically as part of the universe, we should expect it to occupy a space independent of human cognition. Here’s the thing, though. Human cognition is little more than an abstraction machine, and it works in funny ways.

Let’s say you have only one penny, and it’s the only penny to ever exist. You understand that there is now one penny, where before there were zero pennies. This one penny has its own intrinsic identity because of its intrinsic uniqueness. But let’s now say you are already familiar with nickels, dimes, and quarters. As soon as you see the penny and realize it is unique, you also recognize that it matches an existing abstraction you have in your head – called a “coin”. A coin is the abstract concept of a smallish, flat, cylindrical object, made from metal, and representing a monetary unit. The universe has no built-in, intrinsic concept of a coin. The idea was created by humans. The abstraction was created by humans.

Abstraction is the process of recognizing similar traits among several objects and establishing what Plato might have called a “Form”. Plato seemed convinced that Forms represented some intrinsic and true reality. However, Plato didn’t seem to understand that these Ideas, these Forms, are just figments of the machinery of the human mind. It would be like a computer whose cognition believes binary data and logic are the intrinsic and true reality – because that cognition is built on machinery of binary data and logical circuits, it understands everything about the world in those fundamental terms. Humans understand the world through the fundamental machinery of abstractions and Forms. And though that is the way computers and humans understand the world, they are ultimately limited by their machinery. For the computer, the two pennies are understood both as unique and as instances of an abstraction.
The computer has a model that represents the shared features of all pennies. These might be articulated as single bits of data. For example, maybe the first bit tells the computer whether the Form is animate (e.g. alive) or not. Maybe the second bit tells the computer whether the material is metal or not. Maybe the third bit tells the computer whether the object is square or cube shaped. Maybe the fourth tells it that the size is smallish. And so on. To the computer, the abstract definition of a penny might look like this: 01010010100110010000010111100010001001. Each 1 or 0 is a bit that defines some part of the abstraction, or Form. Explicitly absent from the abstraction of the penny are the age, specific year inscription, shininess, and other characteristics that can be ignored for all pennies. So, any two pennies, when conceived by the computer, will have the same abstraction data: 01010010100110010000010111100010001001.
Now, let’s say the next 2 bits describe how shiny the penny is, using something like this:
00 – Completely Dull
01 – Mostly Dull
10 – Mostly Shiny
11 – Completely Shiny
Now, let’s say the next 8 bits describe the year on the penny:
1792 – 00000000
1793 – 00000001
1794 – 00000010
1795 – 00000011
…
2045 – 11111101
2046 – 11111110
2047 – 11111111
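The year mapping above is just an offset from 1792 written as an 8-bit binary string, which caps the representable range at 1792 + 255 = 2047. A minimal sketch in Python (the function name is mine, not part of the scheme):

```python
def year_bits(year: int) -> str:
    """Encode a year as 8 bits, offset from 1792 (range 1792-2047)."""
    offset = year - 1792
    if not 0 <= offset <= 255:
        raise ValueError("year outside the 8-bit range")
    return format(offset, "08b")

print(year_bits(1792))  # 00000000
print(year_bits(1998))  # 11001110
print(year_bits(2004))  # 11010100
```

Note that the finite width of the field is itself a small instance of the essay’s point: a penny from 2048 simply cannot be represented in this scheme.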
Thus, the 1998 penny is defined by the binary string: 010100101001100100000101111000100010010111001110
While the 2004 penny is defined by the binary string:
010100101001100100000101111000100010011111010100
Here, the first 38 bits are the matching abstract portion, while the final 10 bits (2 for shininess, 8 for the year) are the unique portion. The computer can then easily “conceive” the abstract similarity between these pennies by examining only the first 38 bits. It can also “conceive” the intrinsic uniqueness of the pennies by examining all 48 bits.
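The computer’s two levels of “conception” can be sketched directly: compare only the shared 38-bit prefix to judge abstract sameness, or all 48 bits to judge uniqueness. (The bit strings are the ones from the text; the function names are mine.)

```python
ABSTRACTION_BITS = 38  # shared Form; the last 10 bits (2 shininess + 8 year) are unique

penny_1998 = "010100101001100100000101111000100010010111001110"
penny_2004 = "010100101001100100000101111000100010011111010100"

def same_form(a: str, b: str) -> bool:
    """Abstractly the same: identical in the shared prefix."""
    return a[:ABSTRACTION_BITS] == b[:ABSTRACTION_BITS]

def same_identity(a: str, b: str) -> bool:
    """Identical in every bit: the very same individual thing."""
    return a == b

print(same_form(penny_1998, penny_2004))      # True  -- both are pennies
print(same_identity(penny_1998, penny_2004))  # False -- distinct individuals
```

The abstraction is literally a prefix here: throwing away the suffix is what makes the two pennies countable as “two pennies”.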
You can probably also now see that for computers, there must be some finite limit of conception, since there is a finite limit of hardware bits available to the computer. At some point, the computer can only process abstractions, because there aren’t enough bits to represent all aspects of all things. Humans have a similar limitation, in that their representation is based on the neurons and neural connections in the brain. Fortunately, humans have a vast capacity of neurons and connections between them, but it is still finite.
But what does it mean to have a finite capacity of conception? It means that all possible phenomena in the universe must be reducible to some equal or smaller set of concepts. In other words, our understanding of the universe is essentially limited by our ability to conceive of it. Anything beyond our ability to conceive is lost to our understanding. It would be like trying to see color in a black-and-white photo: the information is not detectable, not available.
Ultimately, then, identity is – perhaps – the optimal configuration of the understanding of uniqueness, and of the phenomenon of uniqueness, within a “thing”. Identity is the unique set of information about an object. Or, rather, it is the unique configuration of the thing in our finite mind. And then – by definition – any number (perhaps infinite?) of truly unique phenomena might ultimately be represented in our minds as a single, general case. You might imagine this in the way a 64K-color display represents an image versus a display with millions or billions of colors. The original scene may have trillions of colors, so either display is ultimately generalizing certain colors. The higher-color screen has to generalize less; but ultimately, some generalization occurs without any record of how much generalization is occurring. The machinery is limited, and can only report on what it knows and does. A car tire without any sensory capability never knows whether it is on pavement or grass. A simple sensor that detects only a surface cannot ascertain the difference between pavement and grass, but can detect that there is a surface. This implies a limitation in our ability to understand things.
But what is our limitation, really? The cynical view is that we are limited by what we can individually conceive. But that’s a bit like saying the limits of a computer are a single disk drive. While there may be gaps of imagination, there has never been a fundamental limit on a computer’s access to information, nor on a Turing machine’s. A Turing machine can work on a tape of arbitrary, even infinite, length. The mind, however, can only work on a storage volume of finite size.
Put differently, an “identity” is a single frame of tape in an optimal Turing machine – one with finite or infinite tape – that describes the universe. The first abstraction is itself a unique frame on the tape, and subsequent frames are additional abstractions. Thus demonstrating that a pure Turing machine is more capable of understanding the universe than any human.