News
Market Insights: Should we take notice of ChatGPT and allied AI technologies in MedComms?
Compass Life Sciences’ Medical Affairs Consultant, Kerry Harris, recently attended the MedComms Networking #MedComms webinar ‘Should we take notice of ChatGPT and allied AI technologies in MedComms?’. In this latest article, Kerry summarises the key takeaways from the webinar and the panel’s predictions for the future of AI in the Medical Communications industry.
The webinar was hosted by Peter Llewellyn, along with guest speakers Martin Delahunty, Katja Martin, Stephen Mott and Avishek Pal.
The focus of the webinar was the rise of new AI systems, such as ChatGPT, and the impact these systems could have on medical communications and the Life Sciences industry in general.
Key takeaways about ChatGPT
Kerry noted the following key takeaways from the discussion among the industry expert panellists.
- The current ChatGPT is being used as a driver for the forthcoming 4.0 version
- The AI works by creating text through systematic analysis
- This version uses 170 million tokens; the 4.0 version is expected to use 170 trillion
- Microsoft has recently invested $10 billion in OpenAI, the company behind ChatGPT, to rival Google’s search engine
- Microsoft is encouraging people to use the feedback tool when references or information are inaccurate, allowing the AI to learn and adapt.
Currently identified limitations of ChatGPT
- The prompts entered will affect the output, so it is important to give the correct directions, e.g. who is the audience?
- It should not be assumed that it will deliver good-quality information; the output would need to be reviewed, which could potentially create a role within MedComms to ascertain whether the text is accurate and of good quality.
- Currently, the data ChatGPT uses to produce its output comes from sources up until October 2021, which limits the information provided and means that, in some instances, it could now be outdated.
How ChatGPT could affect the Medical Communications industry
As with most AI systems, it should be used as a tool or starting point and should not be seen as a replacement for the human element; it is not a threat but an opportunity for people to develop their analytical skills.
Within medical writing, it would not be of use in regulatory writing. However, in certain areas of MedComms it could be used for Plain Language Summaries (PLS) to help speed up the process, although for deliverables such as abstracts it could have the opposite effect, as verification would be needed.
One concern is that a pharma sponsor may opt to use this technology rather than a medical communications agency. This again comes back to making sure the prompts are correct.
Further information
- Video posted 18 January 2023: Generative AI & ChatGPT for Pharma at https://youtu.be/HxiDMvn9YT0
- Article in Nature, published 18 January 2023: ChatGPT listed as author on research papers: many scientists disapprove at https://www.nature.com/articles/d41586-023-00107-z
- LinkedIn article from Stephen Mott, posted 13 January 2023: chatGPT – is a revolution heading this way for medicine and medical information? at https://www.linkedin.com/pulse/chatgpt-revolution-heading-way-medicine-medical-information-mott/
- Article in The Scholarly Kitchen, published 11 January 2023: Thoughts on AI’s Impact on Scholarly Communications? An Interview with ChatGPT at https://scholarlykitchen.sspnet.org/2023/01/11/chatgpt-thoughts-on-ais-impact-on-scholarly-communications/
- Article in The Spectator, published 10 January 2023: AI is the end of writing at https://www.spectator.co.uk/article/ai-is-the-end-of-writing/
- Blog from Katja Martin, published 10 December 2022: Should ChatGPT be in every medical communicator’s toolbox? at https://www.medtextpert.com/chatgpt-for-medical-communications/