One often reads (and Oracle documentation is no exception) that objects are used to model real-world objects. That view actually originates from before the days of object-oriented programming. At that time, programs stored intermediate results in a common or global area. If that area was not carefully managed, different subroutines and procedures (as methods were called then) modified those values, stepping on each other's toes and making defects very difficult to trace. Naturally, programmers tried to regulate access to the data and to make the intermediate results accessible only to certain methods. A bundle of methods, together with the data that only they can access, came to be known as an object.
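As a concrete illustration of that bundling, here is a minimal PL/SQL sketch; the package and all of its names (counter_pkg, increment, current_value, g_count) are hypothetical, not taken from the text. A variable declared only in the package body is invisible outside the package, so the two subprograms listed in the specification are the only routines that can touch it:

    CREATE OR REPLACE PACKAGE counter_pkg AS
      PROCEDURE increment;                  -- the only way to change the state
      FUNCTION current_value RETURN NUMBER; -- the only way to read it
    END counter_pkg;
    /

    CREATE OR REPLACE PACKAGE BODY counter_pkg AS
      g_count NUMBER := 0;  -- private: declared only in the body,
                            -- so no code outside the package can reach it

      PROCEDURE increment IS
      BEGIN
        g_count := g_count + 1;
      END increment;

      FUNCTION current_value RETURN NUMBER IS
      BEGIN
        RETURN g_count;
      END current_value;
    END counter_pkg;
    /

The body variable g_count plays the role of the old global area, except that access to it is now regulated by exactly two methods.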
Such constructs were ...