Analysis of Reproducibility and Robustness of a Renal Proximal Tubule Microphysiological System OrganoPlate 3-Lane 40 for in Vitro Studies of Drug Transport and Toxicity
Toxicological Sciences (2023)
Texas A&M University
Abstract
Microphysiological systems are an emerging area of in vitro drug development, and their independent evaluation is important for wide adoption and use. The primary goal of this study was to test the reproducibility and robustness of a renal proximal tubule microphysiological system, OrganoPlate 3-lane 40, as an in vitro model for drug transport and toxicity studies. This microfluidic model was compared with static multiwell cultures and tested using several human renal proximal tubule epithelial cell (RPTEC) types. The model was characterized in terms of functional transport for various tubule-specific proteins, epithelial permeability of small molecules (cisplatin, tenofovir, and perfluorooctanoic acid) versus large molecules (fluorescent dextrans, 60-150 kDa), and gene expression response to a nephrotoxic xenobiotic. The advantages offered by OrganoPlate 3-lane 40 compared with multiwell cultures are the presence of media flow, albeit intermittent, and increased throughput compared with other microfluidic models. However, the OrganoPlate 3-lane 40 model appeared to offer only limited advantages (e.g., MRP-mediated transport) in terms of either gene expression or functional transport when compared with multiwell plate culture conditions. Although OrganoPlate 3-lane 40 can be used to study cellular uptake and direct toxic effects of small molecules, it may have limited utility for drug transport studies. Overall, this study offers refined experimental protocols and comprehensive comparative data on the function of RPTECs in traditional multiwell culture and the microfluidic OrganoPlate 3-lane 40, information that will be invaluable for prospective end-users of in vitro models of the human proximal tubule.
Key words
kidney, in vitro models, validation