The tasks associated with each sub-process are also listed alongside.

Figure 1. Schematic of the high-level business process of a study, showing the sub-processes and the associated tasks. Tasks automated by the macros are underlined and shown in blue.

The matrix of formulation variables to be tested is based on the goal of the study. Based on the formulation matrix, the material requirements for protein and packaging containers are calculated and recorded. Labels are created in Microsoft Word and contain all the relevant information to track and identify the sample: protein name, study name, formulation, storage condition, and timepoint.
The protein formulations are prepared manually, filled into storage containers, and labeled. Samples are grouped by storage condition. This marks the start of the study (time zero). At each timepoint, samples are retrieved from the storage locations, aliquoted, and analyzed on the various instruments as defined in the study. On the HPLC (high-performance liquid chromatography) instruments, samples are placed in an autosampler tray for automatic sequential analysis.
A sequence table is set up that lists the samples to be analyzed, including QC (Quality Control) reference standards. For all other assays, such as pH, concentration, osmolality, and particle counts, samples are analyzed manually one at a time. The data are either stored on the instrument computer or recorded directly in laboratory notebooks.
Of the analytical methods, two are data-intensive. The first is the HPLC assay used to quantify the various analytes produced by degradation of the protein. Different HPLC methods are used depending on the nature of the degradant analytes to be monitored. The raw data are processed using the native software of the instruments to obtain information such as peak areas and relative peak areas for each analyte. The results are printed on paper and transcribed manually into spreadsheets for every analyte. Approximately 3–5 analytes are measured for every sample; therefore, for a study with eight formulations, 48–80 values (areas and relative areas) are manually entered into spreadsheets per storage condition per timepoint.
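The transcription burden stated above follows from simple arithmetic: each analyte contributes two values (a peak area and a relative peak area) per sample. A short Python sketch, illustrative only (the macros described in this article were written in Visual Basic, not Python):

```python
def manual_entries_per_timepoint(n_formulations, analytes_per_sample):
    # Each analyte yields two manually transcribed values per sample:
    # a peak area and a relative peak area.
    return n_formulations * analytes_per_sample * 2

# For eight formulations and 3-5 analytes per sample:
low = manual_entries_per_timepoint(8, 3)   # 48 values
high = manual_entries_per_timepoint(8, 5)  # 80 values
```

This reproduces the 48–80 values per storage condition per timepoint quoted in the text.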
The particle counting instrument quantifies sub-visible particles in protein formulations using a light obscuration technique. Typically, differential and cumulative particle counts for seven particle size ranges are measured for every sample.
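The cumulative count at a given size threshold is the sum of the differential counts in that bin and all larger bins. A hedged Python sketch of this relationship (the seven bin counts below are hypothetical values, not measured data):

```python
def cumulative_counts(differential):
    # differential: particle counts per size bin, smallest bin first.
    # The cumulative count for a bin is the sum of the differential
    # counts in that bin and every larger bin.
    total = 0
    out = []
    for d in reversed(differential):
        total += d
        out.append(total)
    return list(reversed(out))

# Seven hypothetical size bins (smallest first):
diff = [120, 60, 30, 12, 5, 2, 1]
cum = cumulative_counts(diff)  # [230, 110, 50, 20, 8, 3, 1]
```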
Therefore, for a study with eight formulations, 112 values (seven size ranges × differential and cumulative counts × eight formulations) are manually entered into spreadsheets per storage condition per timepoint. Once the data are entered in spreadsheets, the researcher spends additional time sorting them into templates convenient for graphing. For the HPLC assays, further calculations are performed to extract protein recovery based on the QC reference standards run during the sequence. Metadata stored in laboratory notebooks are also entered into spreadsheets to document all the details of the timepoint electronically.
After a spreadsheet is updated with data from the latest timepoint, the data are graphed to view trends in the degradant profiles over time at each storage condition. The sub-processes Execute, Analyze, and Generate Reports are repeated for every timepoint in the study. These profiles are then used to assess the protein stability in the various formulations in the study. Trends from early timepoints are used to narrow the range of formulation variables, and a follow-on study is set up.
Typically, a large number of studies must be set up to gather enough data to recommend a formulation that ensures protein stability at the recommended storage temperature over the desired shelf life. An analysis of this workflow revealed that although the data are captured electronically at the source, nearly all the data entry is manual, because the data reside in many formats across various applications. An enormous amount of time is spent transferring experimental data from notebooks or paper printouts from instruments to applications for analysis and presentation.
This is a considerable duplication of effort and can lead to transcription errors because the same data are re-entered several times. Moreover, if and when reanalysis of raw data is required, the spreadsheets have to be updated with the new data, which increases processing time further.
For a researcher managing several assays and studies, this can lead to a considerable backlog. Thus, it is clear that the workflow can benefit from automation of both hardware and software to increase efficiency and productivity. In this article, we describe the software automation of the tasks highlighted in blue in Figure 1. An analysis of the workflow shown in Figure 1 revealed that some of the key information is defined at the start of the study and resides in the study design and sampling schedule documents, but this information is re-entered many times in several sub-processes. Thus, if these documents are structured as templates, the information transfer to the various sub-processes can be automated by macros.
Therefore, two templates were defined: the StudyDesign and PullDates workbooks. The information in these templates is populated in a standardized format by the user at the planning stage and can then be accessed by a number of macros at various stages in the workflow. For data import and transfer purposes, two additional Excel workbook templates were also created.
These template workbooks contain a number of predefined worksheets with definitions for (i) importing data from the Agilent HPLC and the HIAC particle counting instrument and (ii) sorting data into two template styles to generate graphs. For the protein and container requirements of a study, we have developed an Excel template with formulas that calculates these requirements based on user input.
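The requirement calculation can be sketched as follows. This is a minimal Python illustration, not the exact formulas in the Excel template; in particular, the 10% overage allowance is an assumption added here for realism:

```python
def material_requirements(n_formulations, n_conditions, n_timepoints,
                          fill_volume_ml, protein_conc_mg_ml, overage=0.10):
    """Rough material estimate for a study.

    One container is assumed per formulation/condition/timepoint
    combination; the overage fraction is a hypothetical allowance
    for fill losses, not a value from the original template.
    """
    n_containers = n_formulations * n_conditions * n_timepoints
    volume_ml = n_containers * fill_volume_ml * (1 + overage)
    protein_mg = volume_ml * protein_conc_mg_ml
    return n_containers, volume_ml, protein_mg

# Eight formulations, three storage conditions, five timepoints,
# 1 mL fill at 10 mg/mL:
containers, vol, protein = material_requirements(8, 3, 5, 1.0, 10.0)
# 120 containers, roughly 132 mL and 1320 mg with the assumed overage
```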
Data acquired by the HPLC are processed (peak areas are integrated) using the native software, and the results can be printed to electronic report files. On the particle counting instrument (HIAC Royco), metadata and raw data for all samples analyzed in a session are stored in two report files. The data and file formats from a Circular Dichroism spectrometer were also identified for import into Excel. However, because this assay is not a routine stability-indicating assay, its details will not be discussed.
The macros were coded by formulation scientists who have a good understanding of the business process and skills in writing macros. Therefore, the macros were implemented in a relatively short time. Also, the macros that addressed the most cumbersome step in the business process, namely data entry and organization, were written and implemented first.
The macros evolved to encompass all steps of the business process. Some 20 custom macros were written using Microsoft Visual Basic 6. These macros fall into five broad categories based on functionality: (i) data import, (ii) template transfer, (iii) planning and scheduling, (iv) graphs, and (v) project overview. The Visual Basic module is composed of a number of subroutines and functions that make the code efficient and modularize the package for easy tracking and debugging.
Several user forms were also written to obtain user input wherever necessary. A help file is also called by the error-handling routine and helps users do basic troubleshooting. The design-time and run-time errors in the macros were debugged using the standard tools available in Excel. Once the macros were made available to a wider group in the department, it was recommended that the data be spot-checked for accuracy.
While the initial macros were completely functional, with increased use they have evolved to become more user friendly based on user input, and several new features have been added to the macros package to accommodate the reporting-style preferences of other subgroups. For example, the error bars reported in graphs were calculated differently by different groups. In such cases, several options were made available for the error bar calculations.
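Making the error-bar calculation selectable might look like the following Python sketch. The three methods shown (sample standard deviation, standard error of the mean, half the min-max range) are plausible examples of such options, not necessarily the exact ones implemented in the macros:

```python
import statistics

def error_bar(values, method="sd"):
    # Different subgroups preferred different error-bar definitions,
    # so the calculation is selectable (method names are illustrative).
    if method == "sd":       # sample standard deviation
        return statistics.stdev(values)
    if method == "sem":      # standard error of the mean
        return statistics.stdev(values) / len(values) ** 0.5
    if method == "range":    # half the min-max spread
        return (max(values) - min(values)) / 2
    raise ValueError(f"unknown method: {method}")

replicates = [9.8, 10.1, 10.4]
sd_bar = error_bar(replicates, "sd")        # about 0.3
range_bar = error_bar(replicates, "range")  # also about 0.3 here
```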
In every case, when an upgrade was made, regression testing was done to ensure that the previous functionality of the macros was not compromised and that no new errors were introduced. A schematic of the workflow for planning and preparation of a study is shown in Figure 2. The corresponding tasks automated by the macros are shown in blue. Templates are used to define the sampling schedule (PullDates).
Figure 2. Schematic of the workflow for planning and preparation of a study using the macros. Tasks automated are underlined and shown in blue.
The user enters information once in the StudyDesign template. This information is used by the macros for the sub-processes. Material Requirements. A spreadsheet template was created to calculate the quantity of protein and packaging containers required, based on the formulation matrix, fill volumes, protein concentration, storage conditions, and timepoints entered in the study design. Sample Labels. GenerateLabels (Macro1) uses the PullDates template to generate the sample labels.
Based on feedback from the users, the labels can be printed down or across the sheet and can also be sorted so that each sheet contains one formulation. The sequence table is filled in by the user at the instrument interface and lists the sample names, their locations in the autosampler tray, the instrument method, and the run conditions. Based on a user-specified start date for the study, the dates for stability analysis are generated from PullDates.
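The label-generation step (one label per formulation/condition/timepoint combination, carrying the tracking fields listed earlier) can be sketched in Python; the field layout and separator below are illustrative, not the actual label format:

```python
from itertools import product

def generate_labels(study, formulations, conditions, timepoints):
    # One label per formulation/condition/timepoint combination; the
    # fields mirror the tracking information printed on the labels
    # (study, formulation, storage condition, timepoint).
    return [f"{study} | {f} | {c} | {t}"
            for f, c, t in product(formulations, conditions, timepoints)]

labels = generate_labels("Study-01", ["F1", "F2"], ["5C", "25C"], ["T0", "T1"])
# 2 formulations x 2 conditions x 2 timepoints = 8 labels
```

Sorting this list by formulation before printing corresponds to the "one formulation per sheet" option mentioned above.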
Weekends are highlighted in red to allow the user to modify timepoints if necessary. This information is saved to Calendar. A reminder for the analyst is set a day before the pull date. Data Import into Excel. A schematic of the workflow for analysis of a given timepoint in a study is shown in Figure 3. In the second step, imported data are sorted into two template styles using TemplateTransfer (Macro7).
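The pull-date scheduling logic (timepoint offsets from a start date, weekend flagging, and a reminder one day before each pull) can be sketched as follows; the offsets and start date are example values:

```python
from datetime import date, timedelta

def pull_schedule(start, offsets_days):
    # For each timepoint offset, compute the pull date, flag weekends
    # (so the user can shift them), and set a reminder one day before.
    schedule = []
    for offset in offsets_days:
        pull = start + timedelta(days=offset)
        schedule.append({
            "pull_date": pull,
            "is_weekend": pull.weekday() >= 5,   # Sat = 5, Sun = 6
            "reminder": pull - timedelta(days=1),
        })
    return schedule

# Example: study starting Monday 2024-01-01 with pulls at 0, 13, 28 days.
rows = pull_schedule(date(2024, 1, 1), [0, 13, 28])
# The day-13 pull lands on a Sunday and would be flagged for adjustment.
```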
Finally, graphs are generated using GenerateGraphs (Macro8) to view trends over time.

Figure 3. Data for each timepoint are imported using Macro5 or Macro6 and stored in separate worksheets (Timepointsheet) in the same workbook. The data are then sorted into two templates using the TemplateTransfer macro.
Macro8 generates graphs of stability over time or temperature using the format in Template Style 2. Details of Data Import. After the raw data are processed, the user prints an electronic report to two files. In Excel, the user enters the data location in the Import Definitions1 worksheet (yellow cells). For each sample, ChemStationRead imports the metadata from the report file. This allows the user to rapidly validate the data analysis (shown in red in Figure 3).
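The idea behind ChemStationRead (pull metadata and peak results out of a text report into structured worksheets) can be illustrated with a hypothetical parser. The actual ChemStation report layout is not reproduced in the text, so the "Key: value" metadata lines and comma-separated peak rows below are assumed formats for illustration only:

```python
def parse_report(text):
    # Hypothetical parser, assuming "Key: value" metadata lines
    # followed by "peak, area" rows; not the real ChemStation layout.
    meta, peaks = {}, []
    for line in text.splitlines():
        line = line.strip()
        if not line:
            continue
        if ":" in line:
            key, _, value = line.partition(":")
            meta[key.strip()] = value.strip()
        else:
            name, area = line.split(",")
            peaks.append((name.strip(), float(area)))
    return meta, peaks

sample_report = """Sample: F1-5C-T1
Method: SEC-01

monomer, 9500.2
dimer, 310.5"""
meta, peaks = parse_report(sample_report)
```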
The data for every timepoint are compiled in a worksheet, and all timepoints for a study are tracked in a single workbook.
Timepointsheet provides a quick overview and check of the data for the most recent timepoint, but to facilitate graphing, the data are further sorted as described in the next section.
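The sorting step amounts to pivoting per-timepoint rows into one time-ordered series per formulation, the shape needed for trending graphs. A minimal Python sketch (column names and units are illustrative):

```python
def to_graph_template(rows):
    # rows: (formulation, timepoint_weeks, value) tuples as compiled
    # per timepoint; pivot into one series per formulation, sorted by
    # time, which is the layout needed for stability trend plots.
    series = {}
    for formulation, week, value in rows:
        series.setdefault(formulation, []).append((week, value))
    for points in series.values():
        points.sort()
    return series

rows = [("F1", 4, 1.2), ("F2", 0, 0.4), ("F1", 0, 0.5), ("F2", 4, 1.0)]
series = to_graph_template(rows)
# series["F1"] is the time-ordered trend [(0, 0.5), (4, 1.2)]
```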