Benefits
Initially, quantum computers were associated with cryptography. In particular, they could be used to find the prime factors of large numbers, which would break security based on RSA, probably the most widely used encryption method today. The discovery in 1994 of Shor's algorithm, which finds such factors on a quantum computer in polynomial time, gave a boost to the nascent technology. If quantum computers could indeed factor large numbers quickly, much of today's internet security could be rendered obsolete at a stroke.
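To see where the quantum speed-up enters, here is a minimal classical sketch of Shor's reduction from factoring to order-finding. The function names are illustrative, not from any library; the `find_order` loop is the exponentially slow step that Shor's quantum period-finding subroutine replaces, while the gcd post-processing already runs in polynomial time.

```python
import math
import random

def find_order(a, N):
    """Brute-force the multiplicative order r of a mod N: the smallest
    r > 0 with a**r ≡ 1 (mod N). This loop takes exponential time in
    the number of digits of N; it is exactly the step a quantum
    computer performs in polynomial time."""
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_classical_part(N):
    """Classical post-processing of Shor's algorithm: turn an order r
    into a nontrivial factor of N, retrying on unlucky choices of a."""
    while True:
        a = random.randrange(2, N)
        g = math.gcd(a, N)
        if g > 1:
            return g                      # lucky: a already shares a factor
        r = find_order(a, N)
        if r % 2 == 0:
            f = math.gcd(pow(a, r // 2, N) - 1, N)
            if 1 < f < N:
                return f                  # nontrivial factor found

print(shor_classical_part(15))  # prints 3 or 5
```

For small N this runs instantly, but `find_order` becomes hopeless for the thousand-bit numbers used in RSA, which is why the quantum subroutine matters.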
Nowadays, the use of quantum computers for molecular modelling is an area of active research. Modelling even relatively simple molecules on the world's fastest classical computer may take years. Molecular modelling can help find more efficient catalysts, develop more effective medicines and new materials, and in general disrupt the materials, chemistry, and drug industries. The Haber-Bosch process, the industry norm for producing ammonia and hence fertilizers, is energy intensive and is therefore a focus of such research.
Quantum computers could also accelerate the development of machine learning, which in turn aids robotics, automation, driverless vehicles, and pattern recognition (e.g. face and handwriting recognition). They could also solve computationally intensive optimization problems.
They can help in scientific research, e.g. modelling the interior of black holes, or calculating reaction rates, combustion, and other effects.
However, while quantum computers may solve some problems faster than classical computers (the computers we use today), they are not expected to solve NP-complete problems efficiently. These are problems whose solutions can be verified in polynomial time but for which no efficient way to find a solution is known. This class of problems may therefore remain beyond polynomial time for both classical and quantum computers.
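The asymmetry between verifying and finding can be made concrete with subset sum, a classic NP-complete problem: checking a proposed subset takes one pass over it, but the only known general way to find one is to try subsets, of which there are 2^n. A small illustrative sketch:

```python
from itertools import combinations

def verify(subset, target):
    # Checking a proposed answer is easy: one pass, polynomial time.
    return sum(subset) == target

def search(nums, target):
    # Finding an answer has no known shortcut: in the worst case
    # we examine all 2**n subsets of nums.
    for r in range(len(nums) + 1):
        for subset in combinations(nums, r):
            if verify(subset, target):
                return subset
    return None

print(search([3, 34, 4, 12, 5, 2], 9))  # prints (4, 5)
```

Doubling the input size roughly squares the search effort, while verification cost only doubles; that gap is the essence of NP-completeness.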
Implementation issues
The working of quantum computers is fundamentally different from that of classical computers. Their basic building blocks are qubits, which can exist in a superposition of the '0' and '1' states, whereas a classical bit represents either '0' or '1'. As a result, n qubits can represent 2^n numbers simultaneously, with varying probabilities, so each additional qubit doubles the possibilities. The qubits must also remain entangled, i.e. what happens to one should instantly affect the other, even when they are physically separated. Superposition and entanglement are quantum effects that classical computers do not use; indeed, as transistors get smaller, classical computers must avoid quantum effects, while quantum computers gainfully employ them.
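The exponential growth and entanglement can both be seen in a toy state-vector simulator. The sketch below (plain Python, illustrative function names, qubit 0 taken as the least significant bit) stores the 2^n amplitudes of a 2-qubit register and builds the entangled Bell state with a Hadamard gate followed by a CNOT:

```python
import math

def apply_hadamard(state, q):
    """Apply a Hadamard gate to qubit q, putting it into an equal
    superposition of 0 and 1."""
    h = 1 / math.sqrt(2)
    new = state[:]
    for i in range(len(state)):
        if not (i >> q) & 1:          # index i has qubit q = 0
            j = i | (1 << q)          # partner index with qubit q = 1
            a, b = state[i], state[j]
            new[i], new[j] = h * (a + b), h * (a - b)
    return new

def apply_cnot(state, control, target):
    """Flip the target qubit wherever the control qubit is 1 —
    this is the step that entangles the two qubits."""
    new = state[:]
    for i in range(len(state)):
        if (i >> control) & 1:
            new[i] = state[i ^ (1 << target)]
    return new

# n qubits need a vector of 2**n complex amplitudes.
n = 2
state = [0j] * (2 ** n)
state[0] = 1 + 0j                     # start in |00>

# Hadamard on qubit 0, then CNOT(0 -> 1), yields the Bell state
# (|00> + |11>)/sqrt(2): measuring one qubit fixes the other.
bell = apply_cnot(apply_hadamard(state, 0), 0, 1)
print(bell)
```

Note how the simulator's memory is the list of 2^n amplitudes: simulating 10 qubits needs 1,024 entries, 20 qubits over a million, which is precisely why classical simulation of quantum machines does not scale.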
Producing qubits that satisfy these conditions has been the major challenge in the development of quantum computers. Qubits rapidly tend to lose their superposition, leading to decoherence (the system becomes entangled with its environment). This happens very quickly, and all that is left are classical states. Short coherence times introduce errors into calculations. Small errors can, however, be corrected through quantum error correction, allowing longer calculations. The challenge is to increase the decoherence time, which usually means isolating the system from its environment (since interactions with the external world cause the system to decohere), reducing lattice vibrations, and so on.
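Real quantum error correction measures error syndromes without copying the quantum state, but its core idea, spreading one logical bit over several noisy physical bits, has a simple classical analogue: the 3-bit repetition code with majority vote. The sketch below is that classical analogue, with an assumed per-bit flip probability of 0.1:

```python
import random

def encode(bit):
    # Repetition code: copy the logical bit into three physical bits.
    return [bit, bit, bit]

def noisy_channel(code, p):
    # Each physical bit flips independently with probability p,
    # a toy stand-in for decoherence-induced bit-flip errors.
    return [b ^ (random.random() < p) for b in code]

def decode(code):
    # Majority vote corrects any single bit-flip.
    return int(sum(code) >= 2)

# An unprotected bit fails with probability p = 0.1; the coded bit
# fails only when two or more of the three copies flip, roughly
# 3*p**2 ≈ 0.03.
random.seed(0)
trials = 100_000
errors = sum(decode(noisy_channel(encode(0), 0.1)) for _ in range(trials))
print(errors / trials)  # well below the raw 0.1 error rate
```

The price is overhead: three physical bits per logical bit here, and far more in real quantum codes, which is one reason hardware qubit counts must grow well beyond the number of useful logical qubits.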
This technical challenge has led to a number of approaches being tried for creating qubits. IBM and Google build their qubits out of superconducting materials. Intel and others are working with qubits fabricated from tiny pieces of silicon known as quantum dots. D-Wave uses niobium loops as qubits. One small company, Quantum Circuits, is building small machines and networking them together, thereby reducing error rates. Qubits can also be photons inside optical cavities, ions trapped and manipulated using lasers, or even different modes of light. Many of these methods need temperatures near absolute zero, often in the millikelvin range, essentially to suppress decoherence.
Current status
The number of qubits has increased slowly, and even today it is a small two-digit number, but progress is gaining speed. In 2016 IBM, through its Quantum Experience program, offered free use of a quantum computer with 5 qubits; the same program now offers a 16-qubit machine. IBM has announced that it wants to build a 50-qubit quantum computer and has already created a prototype; Google has the same aspiration. The number 50 is significant because "quantum supremacy" could be achieved around this number: the point at which the computational power of a quantum computer can no longer be matched by classical machines. Errors, however, imply that more than 50 qubits will be needed to get there. Other companies, e.g. Microsoft and Rigetti, are also aiming to increase the number of qubits.
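Why roughly 50 qubits? A classical machine simulating a quantum one must store all 2^n amplitudes, and that memory doubles with every added qubit. A back-of-the-envelope sketch, assuming 16 bytes per double-precision complex amplitude:

```python
def statevector_bytes(n, bytes_per_amplitude=16):
    # A full n-qubit state vector has 2**n complex amplitudes.
    return (2 ** n) * bytes_per_amplitude

for n in (30, 40, 50):
    print(f"{n} qubits: {statevector_bytes(n) / 2**30:,.0f} GiB")
# 30 qubits fit in a laptop (16 GiB), 40 qubits need a large
# cluster (16,384 GiB), and 50 qubits need about 16 million GiB,
# beyond any existing classical machine.
```

The doubling per qubit, not any single large number, is the point: each extra qubit pushes the classical simulation cost up by another factor of two.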
One company, D-Wave, uses a completely different approach called quantum annealing and has produced a 2000-qubit quantum computer. This is not a general-purpose quantum computer but is better suited to certain applications. Nevertheless, the company has clients such as Lockheed Martin, Google, NASA, and Los Alamos National Laboratory.
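Quantum annealing seeks the lowest-energy configuration of an Ising-type problem. Its classical cousin, simulated annealing, conveys the idea in a few lines; the sketch below is that classical analogue (illustrative code, not D-Wave's interface), cooling a 4-spin ferromagnet down toward its aligned ground state:

```python
import math
import random

def ising_energy(spins, J):
    # Energy of an Ising configuration: -sum of J[i][j] * s_i * s_j.
    return -sum(J[i][j] * spins[i] * spins[j]
                for i in range(len(spins)) for j in range(i + 1, len(spins)))

def simulated_anneal(J, steps=5000):
    """Classical simulated annealing: propose single-spin flips and
    accept uphill moves with a probability that shrinks as the
    temperature is lowered, so the system settles into low energy."""
    n = len(J)
    spins = [random.choice([-1, 1]) for _ in range(n)]
    energy = ising_energy(spins, J)
    for step in range(steps):
        T = max(0.01, 2.0 * (1 - step / steps))    # cooling schedule
        i = random.randrange(n)
        spins[i] = -spins[i]                       # propose a flip
        new_energy = ising_energy(spins, J)
        if new_energy > energy and random.random() >= math.exp((energy - new_energy) / T):
            spins[i] = -spins[i]                   # reject: flip back
        else:
            energy = new_energy                    # accept
    return spins, energy

# Toy problem: 4 spins that all want to align (ferromagnetic
# couplings); the ground state is all spins equal, energy -6.
random.seed(1)
J = [[0, 1, 1, 1], [0, 0, 1, 1], [0, 0, 0, 1], [0, 0, 0, 0]]
spins, energy = simulated_anneal(J)
print(spins, energy)
```

A quantum annealer explores the same energy landscape but uses quantum fluctuations (including tunnelling through barriers) instead of thermal ones, which is why it is suited to such optimization problems rather than general-purpose computation.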
Outlook
Quantum computers are now moving beyond theoretical possibility and entering the commercial space, but they remain bulky and expensive. The reason is that the most common methods, e.g. superconducting circuits and quantum annealing, need cryogenics to function, which also has energy implications for maintaining such low temperatures. The machines therefore contain large cylindrical freezers, wiring, and so on, of which the processor itself is a relatively small part. Secondly, the first tasks on which quantum computers prevail will be contrived problems, set up to be difficult for a classical computer but easy for a quantum one. Quantum computers also still rely on classical sequencing and classical control of their operations. In many areas, no quantum algorithm is known that shows any improvement over classical algorithms. In particular, the experience of the average end user will not change.
The next decade will see a quantum leap in the commercial use of quantum computers, but classical computers will remain the dominant form of computing machine, and average end users will continue to identify computing power with them.