What is the full form of "computer"? This is one of the most popular questions among people who are new to the field of computers or have recently started studying them. Yes, it is important to find the actual meaning of the word, but to understand the term more accurately, we need to look a little into history.
Computer Full Form
In earlier centuries, ordinary people faced a common problem: how to do calculations? They devised conventional methods, such as counting on fingers or with sticks. These worked for small numbers, but bigger problems called for experts, and there have always been people who were very good with numbers. Such specialists in computation and number work were given the name "computers". The word comes from the Latin "putare", which means both to think and to prune. So a computer is something that computes, that is, performs arithmetic operations (addition, subtraction, multiplication, and division) and logical operations (greater than, less than, equal to, not equal to, AND, OR).

No official full form of "computer" exists. The popular expansion "Common Operating Machine Purposely Used for Technological and Educational Research" is a home-cooked backronym, one of several that many of us were made to mug up without understanding their meaning. This article is meant to break that myth: Computer is a term, not an abbreviation, given to an electronic machine.

Now that we have understood the real meaning of the word, let's look at the evolution of computers. As the name suggests, the machine that computes was originally meant only for calculations, not for the things we do nowadays like listening to music and surfing the internet. In the early days, the abacus was a famous calculating device, but it could only perform calculations up to a limit. Blaise Pascal's mechanical calculator (the Pascaline) could do addition, but it too was not sufficient for tedious calculations.
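The arithmetic and logical operations listed above can be sketched in a few lines of Python (the variable names are our own, purely for illustration):

```python
# Illustrative sketch of the operations a computer performs.
a, b = 12, 5

# Arithmetic operations
print(a + b)   # addition
print(a - b)   # subtraction
print(a * b)   # multiplication
print(a / b)   # division

# Logical / comparison operations
print(a > b)            # greater than
print(a < b)            # less than
print(a == b)           # equal to
print(a != b)           # not equal to
print(a > 0 and b > 0)  # logical AND
print(a > 0 or b < 0)   # logical OR
```

Every program a computer runs, however complex, ultimately reduces to combinations of operations like these.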
According to the records, Charles Babbage, the father of the computer, designed the first programmable computer in the 19th century: the Difference Engine, which could tabulate polynomial, logarithmic, and trigonometric functions. He then proposed a new model, the Analytical Engine, which took its instructions on punch cards and was equipped with an arithmetic logic unit, control flow in the form of conditional branching and loops, and integrated memory, making it the first design for a general-purpose computer.

From the late 19th century onward there was a surge in analog computing, and analog computers served well into the 1930s. Early digital computers were electromechanical: electric switches drove mechanical relays to perform the calculations, but they were quite slow. In 1939, the German engineer Konrad Zuse created the Z2, an electromechanical relay computer. In 1941 he followed the same design with the Z3, the world's first programmable, fully automatic electromechanical digital computer, with a 22-bit word length and a clock frequency of about 5-10 Hz. Digital computers of that era used vacuum tubes, with capacitors fixed in a mechanically rotating drum for memory. During World War II, computers were used to decode enemy communications and prevent attacks. Later, John von Neumann devised the architecture on which modern computers are based. Semiconductor transistors were invented in the late 1940s and soon found their way into computers; integrated circuits followed in the late 1950s and replaced individual transistors, since a single chip can contain millions of transistors (thanks to VLSI, Very Large Scale Integration).
The IC led to the development of the first microprocessor, and you may be surprised to know that Intel was the company that created the first processor, the Intel 4004, a 4-bit processor, later followed by the Intel 8008. Many other companies such as Atmel, Motorola, Toshiba, and AMD made processors; some of them failed, but companies like Intel and AMD have worked consistently to deliver better products for high-speed computation. Computers have evolved a great deal and can now finish tedious calculations in a fraction of a second. At the time of writing, Intel has launched its 11th-generation processors and AMD has come up with its latest 5000 series.
Fact: Nowadays we consider a machine a computer only if it can both do calculations and be reprogrammed, like our personal computers. That is why a calculator is not a computer: it cannot be reprogrammed. By the same token, many devices around us are computers: our mobile phones, microcontrollers, tablets, and so on.

We hope this article makes it clear that Computer is a term, not an abbreviation with a full form. Many computer components, however, do have full forms; some common ones are:
RAM: Random Access Memory
ROM: Read Only Memory
DDR RAM: Double Data Rate Random Access Memory
IC: Integrated Circuit
CPU: Central Processing Unit (the processor)
ALU: Arithmetic and Logic Unit
CU: Control Unit
HDD: Hard Disk Drive
SSD: Solid State Drive (much faster than an HDD)
GB: Gigabyte
MB: Megabyte
TB: Terabyte
Mbps: Megabits Per Second
Gbps: Gigabits Per Second
WWW: World Wide Web
HTTP: HyperText Transfer Protocol
HTTPS: HyperText Transfer Protocol Secure
HTML: HyperText Markup Language
XML: eXtensible Markup Language
Wi-Fi: a trademarked name; "Wireless Fidelity" is a popular backronym, not an official expansion
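Note that the list above mixes units of bytes (GB, MB, TB, used for storage) with units of bits (Mbps, Gbps, used for network speed), and 1 byte = 8 bits. A minimal sketch of the conversion (the function name is our own):

```python
def mbps_to_megabytes_per_second(mbps):
    """Convert a link speed in megabits per second (Mbps)
    to megabytes per second (MB/s). 1 byte = 8 bits."""
    return mbps / 8

# A 100 Mbps connection moves at most 12.5 MB of data per second.
print(mbps_to_megabytes_per_second(100))  # 12.5
```

This is why a "100 Mbps" internet plan never downloads files at 100 MB per second.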

That's all. We hope you liked the article.
Thanks for Reading!