Apparently the secret to understanding computers is to understand that they are meaningless.
A computer doesn't know what it is doing. It doesn't 'know' anything. Everything it does is all just a flow of voltages and the absence of voltages. The binary flow of 1s and 0s, empty of inherent meaning. Humans have to attach meaning to that flow.
(Richard Butterworth, Lecture 2: Digital Representation)
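The point about humans attaching meaning can be made concrete: the very same bytes read as completely different values depending on the interpretation we choose. A minimal sketch in Python (the particular byte values are just an illustration):

```python
import struct

# Four bytes with no inherent meaning; only our chosen
# interpretation decides what they "are".
data = b"@I\x0f\xdb"

# Read as a big-endian 32-bit signed integer
as_int = struct.unpack(">i", data)[0]

# Read as a big-endian 32-bit float (these bytes happen to
# approximate pi in IEEE 754 single precision)
as_float = struct.unpack(">f", data)[0]

# Read as Latin-1 text
as_text = data.decode("latin-1")

print(as_int)         # 1078530011
print(as_float)       # 3.1415927...
print(repr(as_text))  # '@I\x0fÛ'
```

One binary flow, three meanings: an integer, a number close to pi, and a scrap of text. Nothing in the bytes themselves tells us which is "right".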
Different applications interpret binary data in different ways, so we need to know which application a particular file is meant to be processed by. This is why data formats are necessary: a file should be opened with the application it was intended for, or with a fully compatible one.
For example, a file created and saved in Microsoft Word won't display properly if you try to open it in Notepad. Word will have saved the file in its .doc format, and Notepad can't process the metadata that is intended to tell Word how to lay out the document.
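We can sketch why a plain-text editor chokes on a Word file. Legacy .doc files are OLE2 compound documents whose first eight bytes are a fixed binary signature, not text, so a program expecting UTF-8 text can't even decode them:

```python
# The OLE2/Compound File signature that opens a legacy .doc file.
# These bytes are format metadata, not text.
header = bytes([0xD0, 0xCF, 0x11, 0xE0, 0xA1, 0xB1, 0x1A, 0xE1])

# A text editor that assumes UTF-8 cannot make sense of these bytes.
try:
    header.decode("utf-8")
    decodes_as_text = True
except UnicodeDecodeError:
    decodes_as_text = False

print("Valid UTF-8 text?", decodes_as_text)  # False

# Forcing the issue produces the familiar "garbage" characters you
# see when a binary file is opened in a text editor.
print(header.decode("utf-8", errors="replace"))
```

The bytes are perfectly meaningful to Word, which knows the format; to Notepad they are just undecodable noise.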
Clearly the boundaries between applications aren't as rigid as this suggests, or as some companies might like: the same data can be converted from one data format to another, and different applications will do different things with the same data.
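A small sketch of that idea, using a made-up table of names and ages: the same data serialised into two different formats, JSON and CSV, each meant for different consuming applications.

```python
import csv
import io
import json

# The same underlying data (an illustrative example)...
rows = [{"name": "Ada", "age": 36}, {"name": "Alan", "age": 41}]

# ...expressed in JSON format...
as_json = json.dumps(rows)

# ...and in CSV format.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["name", "age"])
writer.writeheader()
writer.writerows(rows)
as_csv = buf.getvalue()

print(as_json)
print(as_csv)

# A JSON parser recovers the data from one byte stream; a CSV
# parser would recover it from the other.
recovered = json.loads(as_json)
print(recovered == rows)  # True
```

Two different flows of bytes, one underlying meaning; which applications can use the data depends only on which format we chose.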
It all comes down to that binary flow of something and nothing, 1s and 0s.