Showing 1-5 of 5 trials
NCT07286591
The second-trimester morphology ultrasound is a key examination in obstetric monitoring that aims to assess fetal growth, identify structural abnormalities, and inspect the fetal adnexa (placenta, umbilical cord, cervix, etc.). Several studies suggest that a significant proportion of fetal malformations can be detected during this time frame if a complete morphological analysis is performed. However, the reliability of the screening depends on the quality of the equipment, the operator's level of expertise, and adherence to protocols that define the necessary scans.

In France, since the first reports of the National Technical Committee on Prenatal Screening Ultrasound (2005), particular attention has been paid to standardizing practices. More recently, the French National Conference on Obstetric and Fetal Ultrasound (CNEOF) published new recommendations (2022, revised in 2023), including reference silhouettes for the second-trimester examination, proposing 26 views (22 required and 4 additional). However, the CNEOF does not formalize quality criteria for evaluating the conformity of these images; this task has been taken up by the French College of Fetal Ultrasound (CFEF), which has established a scoring and validation grid for each fetal slice (see the CFEF 2022 document).

In parallel, artificial intelligence (AI) is gradually becoming established as a decision-support and automation tool in medical imaging, particularly in ultrasound. Deep learning algorithms can identify anatomical structures, position measurement markers, and select the optimal slice, reducing inter-operator variability and streamlining workflow. In the field of obstetric ultrasound, some companies have launched systems capable of detecting or annotating fetal structures in real time, potentially improving diagnostic reliability and reproducibility.
Samsung has developed a system called Live View Assist, available on its latest-generation ultrasound scanners, which uses AI to automatically recognize and freeze the required fetal slices in real time. The tool also offers automated validation: if a detected slice conforms to the expected standards, it is checked off directly on a checklist. This innovation promises time savings, a reduced risk of missing certain complex slices, and improved standardization. However, there is little data, particularly in France, regarding the actual performance of this tool in a routine screening context. Before considering the integration of Live View Assist and AI into daily practice, it is therefore essential to evaluate the quality of the images it acquires, the feasibility of a complete AI-assisted examination, and the potential impact on examination time and on sonographers' workload. The aim of this study is to evaluate whether the quality of the 20 mandatory images automatically validated by Live View Assist is non-inferior to that of the 20 mandatory images acquired and validated manually by an ultrasound technician, according to the CFEF quality criteria based on the silhouettes recommended by the CNEOF.
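A non-inferiority comparison of this kind is typically analyzed as a one-sided test on the difference of two proportions (here, the proportion of images meeting the CFEF quality criteria in each arm). The sketch below illustrates the statistical idea only; the counts and the 5-percentage-point margin are hypothetical, not values from this protocol.

```python
from math import sqrt
from statistics import NormalDist

def noninferiority_two_proportions(x_ai, n_ai, x_manual, n_manual, margin):
    """One-sided non-inferiority test for a difference of two proportions.
    H0: p_ai - p_manual <= -margin  (AI arm inferior by more than the margin)
    H1: p_ai - p_manual  > -margin  (AI arm non-inferior)
    Uses the unpooled normal (Wald) approximation."""
    p_ai = x_ai / n_ai
    p_manual = x_manual / n_manual
    se = sqrt(p_ai * (1 - p_ai) / n_ai + p_manual * (1 - p_manual) / n_manual)
    z = (p_ai - p_manual + margin) / se
    p_value = 1 - NormalDist().cdf(z)  # one-sided upper-tail p-value
    return z, p_value

# Hypothetical data: 940/1000 conforming AI-validated images vs
# 950/1000 conforming manually validated images, margin of 0.05.
z, p = noninferiority_two_proportions(940, 1000, 950, 1000, margin=0.05)
print(f"z = {z:.2f}, one-sided p = {p:.5f}")
```

A small p-value here rejects inferiority beyond the margin, i.e. supports non-inferiority; the actual trial may well use a different margin or a confidence-interval formulation.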
NCT07284550
Heart valve diseases are among the most serious cardiovascular conditions in older age. One of the most common forms is aortic valve stenosis, a narrowing of the valve opening between the left ventricle and the main artery. As the valve becomes tighter, the heart must work harder and harder to pump blood through the body. This process often develops slowly over many years and initially causes no clear symptoms. As a result, the condition is frequently detected only in advanced stages, when warning signs such as shortness of breath, chest pain, or dizziness appear. Without treatment, aortic valve stenosis can become life-threatening. If detected early, however, very effective treatment options are available today.

Up to now, the disease has been reliably diagnosed mainly through echocardiography. Yet this method is complex, costly, and requires specialized medical staff. A simple, affordable, and broadly accessible screening option does not yet exist. This interdisciplinary clinical research project explores whether conventional smartphones could fill this gap. Almost all modern devices are equipped with sensors such as microphones, accelerometers, and gyroscopes. These can capture both heart sounds and subtle vibrations of the chest. The research team is investigating whether diagnostic information reliable enough to detect aortic valve stenosis can be extracted from such recordings. To achieve this, the signals are processed with newly developed methods and analyzed using artificial intelligence.

For the study, several hundred patients with and without valve disease will be examined. The smartphone results will be compared with established diagnostic standards, particularly echocardiography, to test accuracy and reliability. If successful, the approach could enable a straightforward, digital heart check at home using nothing more than a conventional smartphone.
Such a tool would provide an accessible, low-cost, and widely available method for early detection, helping more people receive timely and potentially life-saving treatment.
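Testing accuracy against echocardiography as the reference standard usually comes down to counting agreements and disagreements between the two methods and deriving sensitivity and specificity. A minimal sketch, with entirely hypothetical paired results:

```python
def screening_accuracy(smartphone_calls, echo_reference):
    """Compare binary smartphone screening calls (1 = stenosis suspected)
    against echocardiography as the reference standard (1 = stenosis confirmed)."""
    pairs = list(zip(smartphone_calls, echo_reference))
    tp = sum(1 for s, e in pairs if s == 1 and e == 1)  # true positives
    tn = sum(1 for s, e in pairs if s == 0 and e == 0)  # true negatives
    fp = sum(1 for s, e in pairs if s == 1 and e == 0)  # false alarms
    fn = sum(1 for s, e in pairs if s == 0 and e == 1)  # missed cases
    sensitivity = tp / (tp + fn)  # fraction of true stenosis cases detected
    specificity = tn / (tn + fp)  # fraction of healthy patients cleared
    return sensitivity, specificity

# Hypothetical paired results for six patients
calls = [1, 1, 0, 0, 1, 0]
echo  = [1, 0, 0, 0, 1, 1]
sens, spec = screening_accuracy(calls, echo)
print(f"sensitivity = {sens:.2f}, specificity = {spec:.2f}")
```

For a screening tool intended for home use, high sensitivity (few missed cases) is typically the priority, with specificity keeping the rate of unnecessary follow-up examinations acceptable.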
NCT07083791
This study seeks to validate the real-world accuracy of an AI-based algorithm for identifying the location of an accessory pathway from the 12-lead electrocardiogram.
NCT07021781
Mobile applications and artificial intelligence are increasingly integrated into medical practice, yet their impact on workflow optimization and diagnostic accuracy remains understudied. This study evaluates the effectiveness of the MedQuest mobile application in optimizing patient questionnaire processes and assesses the accuracy of AI-driven International Classification of Functioning, Disability and Health (ICF) coding in comparison to traditional clinician-based coding.
NCT06992908
This study evaluates a specially designed AI model, developed by a programmer, trained on X-ray readings and the corresponding treatment decisions (orthodontic treatment only, or orthodontic treatment with surgery) from 70% of the cases. To assess its performance, the model was then tested on the remaining 30% of cases: the programmer provided only the X-ray readings as input, and the AI model was tasked with classifying each case into one of the two treatment categories.
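The 70/30 protocol described above is a standard hold-out split: the model never sees the test cases' treatment decisions during training. A minimal sketch of such a split, with hypothetical case records (the field names and labels are illustrative, not from the study):

```python
import random

def split_cases(cases, train_fraction=0.7, seed=42):
    """Shuffle the case list reproducibly, then split it into a training
    set (X-ray readings + treatment decisions) and a held-out test set
    used only for evaluation."""
    shuffled = cases[:]
    random.Random(seed).shuffle(shuffled)
    cut = int(len(shuffled) * train_fraction)
    return shuffled[:cut], shuffled[cut:]

# Hypothetical cases: (x_ray_reading, decision), with decision in
# {"ortho_only", "ortho_plus_surgery"}
cases = [(f"reading_{i}", "ortho_only" if i % 2 else "ortho_plus_surgery")
         for i in range(100)]
train, test = split_cases(cases)
print(len(train), len(test))  # 70 30
```

At evaluation time, only the `x_ray_reading` part of each test case is shown to the model, and its predicted decision is compared with the recorded one.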