SHORT REPORT
Use of spectrophotometric techniques to determine the optical density (absorbance) of a sample at different dilution factors
NAME & STUDENT ID:
MANMINDER KAUR
COURSE:
Haematology 1
COORDINATOR:
Genia Burchall
PRACTICAL DATE:
26th July, 2012
DUE DATE:
10th August, 2012
ABSTRACT: This experiment aimed to determine the optical density, in terms of absorbance, of a solution at different dilution factors and to obtain a standard curve, the interpretation of which relies on Beer's law.
To achieve this aim, dilutions of known factor were prepared so that absorbance could be measured spectrophotometrically at different concentrations of the solution. Using a micropipette, three different dilutions of the coloured liquid in water were prepared in labelled tubes.
The results were then presented as a table and a graph, and a standard curve was drawn that was expected to obey Beer's law. This law states that absorbance is directly proportional to concentration: as the concentration increases, the absorbance also increases, and vice versa. The equation for Beer's law is A = ε * l * c, where A is the absorbance, ε is the molar absorptivity, l is the path length and c is the concentration.
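The linearity expected from Beer's law can be checked numerically. The sketch below (with purely hypothetical readings, since the report's measured values are not given here) fits a straight line through the origin to absorbance-versus-concentration data, which is what plotting the standard curve does graphically:

```python
# Minimal sketch: fit A = k * c through the origin by least squares,
# where the slope k corresponds to epsilon * l in Beer's law.
# All numeric values below are illustrative assumptions, not measured data.

def fit_through_origin(concentrations, absorbances):
    """Least-squares slope for a line A = k * c forced through the origin."""
    num = sum(c * a for c, a in zip(concentrations, absorbances))
    den = sum(c * c for c in concentrations)
    return num / den

# Hypothetical readings for three dilutions (relative concentration, absorbance)
c = [0.25, 0.50, 1.00]
a = [0.12, 0.24, 0.48]

k = fit_through_origin(c, a)
print(round(k, 3))  # slope of the standard curve
```

If the fitted line passes close to all three points, the data obey Beer's law over this concentration range.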
Finally, a conclusion was drawn from the results. The results met the aim, as a straight line was obtained when absorbance was plotted against the dilution values.
INTRODUCTION: The Beer-Lambert law (or Beer's law) describes the linear relationship between the absorbance and the concentration of a solution. The general Beer-Lambert law is usually written as:
A = ε * l * c, where A is the measured absorbance, ε is a wavelength-dependent molar absorptivity coefficient, l is the path length, and c is the analyte concentration.
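Rearranging the law gives c = A / (ε * l), which is how an unknown concentration is recovered from a measured absorbance. A small sketch, using an assumed molar absorptivity and a standard 1 cm cuvette (both values are illustrative, not from this practical):

```python
# Hedged sketch of applying Beer's law to find an unknown concentration.
# c = A / (epsilon * l); epsilon and A below are assumed example values.

def concentration_from_absorbance(A, epsilon, path_length_cm=1.0):
    """Rearranged Beer's law: c = A / (epsilon * l)."""
    return A / (epsilon * path_length_cm)

A = 0.311          # measured absorbance (assumed)
epsilon = 6220.0   # molar absorptivity in L mol^-1 cm^-1 (assumed)

c = concentration_from_absorbance(A, epsilon)  # mol/L for a 1 cm cuvette
print(f"{c:.2e}")
```

In practice, ε * l is usually not looked up but obtained as the slope of the standard curve measured in the same experiment.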
This law is usually helpful when determining an unknown concentration of an analyte. It can be determined by