The following is content from an external news source, republished with permission.
by Whitney Downard, Pennsylvania Capital-Star
February 24, 2026
When a client told Curtis Taylor that they’d downloaded an app billing itself as an artificial intelligence therapist, the Erie-based licensed counselor knew he had to check it out for himself.
Taylor said the app “glitched out” when he mentioned self-harm and didn’t correct him when he called the chatbot a counselor, prompting him to file an ethics complaint against the company.
“It was very content to masquerade as a counselor … my contention is that it’s not,” said Taylor. “I’ve been vetted. I’ve gone through two graduate-level programs to be a PhD. With my license in counseling, I’ve worked 3,000 (supervised) hours.”
“I have clearance; I’m a mandated reporter. And these are all things that AI just isn’t and won’t be,” he continued.
Across the country, some people have died by suicide in high-profile cases where the deceased used an AI chatbot for assistance with their mental health. Following the deaths, a handful of states have advanced laws restricting their use — and Pennsylvania could be next.
State House Bill 2100 would apply new standards to AI chatbots, prohibiting companies from using them to provide mental health services to Pennsylvanians unless they’re operating under the direction of a licensed therapist.
Even then, under the proposal, the AI could not make individual therapeutic decisions, directly interact with clients in therapeutic communication, or make recommendations or treatment plans. In a hearing before a panel of House Democrats on Tuesday, testimony weighed the uses and limits of AI in the mental health space, shortly after another House committee considered the technology’s role in health care.
Taylor said he wasn’t “anti-AI,” adding that he uses it as a documentation tool or to create worksheets.
“Personally, I think it’s appropriate if students who are trained to become counselors are able to use AI to roleplay as clients. But I don’t think AI has a place to roleplay as a counselor,” Taylor said.
He explicitly said he didn’t want insurers to see AI chatbots as ways to “triage” clients with mental health concerns in order to bypass a human counselor, even if there are provider shortages nationwide.
More to consider
Madeliene Stevens, the government relations committee chair for the Pennsylvania Counseling Association, shared other concerns about the use of AI, including potential confidentiality breaches and lack of oversight.
“There are no current parameters around the types of information and data that AI technology can take in and then what they do with it,” said Stevens.
Molly Cowan, the director of professional affairs for the Pennsylvania Psychological Association, added that AI chatbots “are designed to keep you engaged” and using them as long as possible.
“They are not challenging false assumptions. They are not providing you with new coping skills. They are keeping you talking to them,” said Cowan.
Potential regulations should also continue to allow for technology like note-taking tools used with human oversight, she added.
Cowan cautioned though that in settings where there are multiple types of providers — such as hospitals with physicians and psychologists working together — there could be situations where one doctor could provide AI-assisted care while another is barred from doing so under the current proposal.
Other health care settings
Earlier Tuesday, a bipartisan group of legislators heard about current uses for AI in general health care settings, specifically hospitals.
As of 2024, 71% of hospitals used predictive AI, compared with 32% using generative AI, according to Paige Nong, an assistant professor with the University of Minnesota’s public health school.
“As far as technology implementation in the health care system goes, that’s really, really quick,” said Nong.
Hospitals within larger systems or with higher operating margins were more likely to adopt AI, while rural or critical access hospitals were less likely. The most common uses were for scheduling, follow-up or monitoring and predicting adverse events. One of the faster-growing uses has been billing.
The standardization of “model cards” that summarize basic information about an AI tool — which Nong compared to nutrition labels on food — might help rural or critical access hospitals seeking to adopt a new technology.
But she detailed a split in oversight between AI used for clinical purposes, like supporting a diagnosis or identifying risk factors, and AI used for administrative purposes, such as documentation.
The former has guidance from the U.S. Food and Drug Administration, which also approves AI-enabled devices, while the latter doesn’t.
“It’s not clear who they can look to really … (they) just don’t have the same clarity,” said Nong.
Additionally, she noted that health systems often struggled to effectively evaluate their use of AI: just 57% of the hospitals using AI also evaluated the tools.
For Peter Lazes, a clinical and industrial psychologist, the bigger concerns were the lack of input from frontline care workers during development and too much of a focus on making money or reducing costs.
“So the tools are not then being focused on the problems of frontline staff or on implementing these tools effectively,” said Lazes. “It’s not on pain points or problems of patients or staff … but on ways to make money.”
He said that electronic health records — “a major time suck” — were now used more to document billing codes for reimbursement than to record clinical issues. AI transcription tools, like electronic health records before them, are marketed as efficient timesavers, but, he said, “We’re seeing the same scenario with AI.”
“If physicians and nurses get freed up, the policy of these hospitals is saying, ‘Oh, by the way, now that you have a few more minutes you can see three more patients.’ That’s not helping the situation,” said Lazes.
Nong noted that while clinician satisfaction seemed to improve with such documentation tools, they don’t appear to save time. Lazes added that some doctors report AI transcription takes more time because they have to correct its mistakes.
Lazes proposed the use of state dollars to incentivize the use of co-generative AI, where labor — such as doctors and nurses — drove the development, rather than hospital executives searching for ways to reduce costs.
Pennsylvania Capital-Star is part of States Newsroom, a nonprofit news network supported by grants and a coalition of donors as a 501c(3) public charity. Pennsylvania Capital-Star maintains editorial independence. Contact Editor Tim Lambert for questions: info@penncapital-star.com.