Hello all,
I am currently working on a design that uses an accelerometer as a tilt sensor. The accelerometer circuit worked well on a DC power supply, outputting 2.25V to 2.75V. I then wanted to amplify this 0.5V swing into a 10V swing: that is, instead of an output ranging from 1.75V to 3.25V, I wanted 0V to 10V. The first solution that came to mind was a differential amplifier, so I built one with an LM741, using a resistive voltage divider to generate a 1.75V reference, with a gain of 6.67, since
3.25V − 1.75V = 1.5V
1.5V × 6.67 ≈ 10V
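To double-check the arithmetic, here is a minimal Python sanity check of the intended transfer function, Vout = gain × (Vin − Vref). Note the 5V rail and the 13k/7k divider below are hypothetical example values I picked just to produce 1.75V, not the actual parts in my schematic.

def divider(vcc, r_top, r_bottom):
    # Output of a simple resistive divider from the supply rail.
    return vcc * r_bottom / (r_top + r_bottom)

def diff_amp(v_in, v_ref, gain):
    # Ideal differential amplifier: Vout = gain * (Vin - Vref).
    return gain * (v_in - v_ref)

v_ref = divider(5.0, 13e3, 7e3)    # hypothetical 13k/7k divider -> 1.75V
gain = 10.0 / (3.25 - 1.75)        # ~6.67

for v_in in (1.75, 2.25, 2.75, 3.25):
    print("Vin = %.2fV -> Vout = %.2fV" % (v_in, diff_amp(v_in, v_ref, gain)))

# Vin = 1.75V -> Vout = 0.00V
# Vin = 3.25V -> Vout = 10.00V

So the ideal math maps the endpoints exactly onto 0V and 10V.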
The whole idea worked perfectly in simulation. However, when I hooked up all the components on a board, things started to go wrong. The circuit did amplify the sensor output, but it only swung from about 4V to 10V instead of 0V to 10V. Here is my circuit schematic:
[schematic image]
As you can see, applying the amplifier to the upper and lower limits of the sensor's output range gives the correct amplified values on paper. However, it just does not work in real life. I believe the problem is in the differential amplifier circuit. Can anybody tell me why, and point me toward the right solution? Thanks in advance!