Digital Calibration of 1.5 bit per stage pipelined ADC
Abstract
An analog-to-digital converter (ADC) is a mixed-signal electronic device that converts the
amplitude of an input physical quantity, i.e., a voltage or a current, into a digital
representation. This representation is a stream of bits that is easy to store, can be
processed using digital signal processing techniques, and can be converted back to
analog after processing.
Among the various types of ADCs, the pipelined ADC is used where a high sampling rate
and medium-to-high resolution are required. A pipelined ADC performs analog-to-digital
conversion stage by stage, resolving a few bits in each stage. All stages in a pipelined
ADC are identical except the last one, which is a low-resolution flash ADC. The
performance of a pipelined ADC degrades due to imperfections in its analog circuit
components, such as capacitor mismatch, finite op-amp open-loop gain, and finite
unity-gain bandwidth.
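The stage-by-stage conversion described above can be illustrated with a short behavioral sketch. This is not the circuit implementation from the thesis, only a minimal ideal model: each 1.5-bit stage compares the input against assumed thresholds of ±VREF/4, subtracts the DAC level, and amplifies the residue by 2; the digital outputs are then recombined with overlapping binary weights.

```python
import numpy as np

VREF = 1.0  # full-scale reference voltage (assumed value for this sketch)

def stage_1p5bit(vin):
    """One ideal 1.5-bit pipeline stage: sub-ADC decision, DAC subtraction,
    and gain-of-2 residue amplification. The comparator thresholds at
    +/- VREF/4 give the three-level (1.5-bit) code d in {-1, 0, +1}."""
    if vin > VREF / 4:
        d = 1
    elif vin < -VREF / 4:
        d = -1
    else:
        d = 0
    residue = 2 * vin - d * VREF  # residue passed to the next stage
    return d, residue

def pipeline_adc(vin, nstages=12):
    """Cascade of identical ideal 1.5-bit stages; the digital backend
    reconstructs the input from the stage codes with weights VREF/2**(i+1)."""
    codes = []
    v = vin
    for _ in range(nstages):
        d, v = stage_1p5bit(v)
        codes.append(d)
    return sum(d * VREF / 2 ** (i + 1) for i, d in enumerate(codes))
```

With ideal stages the reconstruction error is bounded by the unresolved residue, i.e. at most VREF/2**nstages; the redundancy of the three-level code is what later absorbs comparator offsets when digital correction is applied.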
This work focuses on correcting the gain error caused by the finite open-loop gain
of the op-amp. The technique proposed here is a foreground digital calibration
technique based on the least mean squares (LMS) algorithm. Simulation results show that
after calibration of a 12-bit, 1.5-bit-per-stage pipelined ADC, the differential
non-linearity (DNL) improves by 30%, and the integral non-linearity (INL) reduces from
+60/-60 LSB to +0.77/-0.77 LSB. The signal-to-noise-and-distortion ratio (SNDR) and
spurious-free dynamic range (SFDR) also improve significantly after calibration, from
35.9193 dB and 36.7348 dB to 75.3619 dB and 82.2884 dB respectively.
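The foreground LMS idea can be sketched in a few lines. This is an assumed toy model, not the thesis implementation: the first stage's residue gain is scaled by a factor beta < 1 due to finite op-amp gain A (here A = 100 is assumed), the backend is idealized as digitizing the residue perfectly, a known calibration input is applied in the foreground, and a digital weight w on the backend word is adapted by LMS until the total digital output matches the known input, so w converges to 1/beta.

```python
import numpy as np

VREF, A = 1.0, 100.0           # assumed reference and op-amp open-loop gain
beta = 1.0 / (1.0 + 2.0 / A)   # closed-loop gain error factor (~2% low here)

def first_stage(vin):
    """First 1.5-bit stage whose residue suffers a finite-gain error:
    the ideal residue 2*vin - d*VREF is attenuated by beta."""
    d = 1 if vin > VREF / 4 else (-1 if vin < -VREF / 4 else 0)
    return d, beta * (2 * vin - d * VREF)

# Foreground LMS loop: a known test input x is applied, the digital output
# y = d*VREF/2 + w*(backend word)/2 is compared against x, and the weight w
# is nudged along the negative error gradient. Step size mu is an assumed value.
rng = np.random.default_rng(0)
w, mu = 1.0, 1.0
for _ in range(2000):
    x = rng.uniform(-VREF, VREF)     # known calibration input (foreground)
    d, res = first_stage(x)
    y = d * VREF / 2 + w * res / 2   # calibrated digital output
    e = y - x                        # error against the known input
    w -= mu * e * (res / 2)          # LMS update drives w toward 1/beta
```

Because the error here is e = (w*beta - 1)*(x - d*VREF/2), each update contracts the weight error, and w settles at 1/beta, digitally undoing the inter-stage gain error; in the full ADC the same correction is what restores the INL, SNDR and SFDR figures quoted above.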
Description
M.E. (Electronics and Communication)
