The Latest | Page 7

Side view portrait of a bearded gentleman lying in bed; young woman in a white lab coat in the blurred background

Washington Bill to Allow Non-MD-Prescribed Assisted Suicide and to Shorten Waiting Period

I previously wrote about pending Oregon and Vermont legislation to do away with the requirement that only doctors be allowed to legally assist suicides. Now it's Washington's turn, with a proposal that would allow "qualified medical providers," defined as licensed physicians, physician assistants, or advanced practice registered nurses, to prescribe the poison. Read More ›
Airplane flying over colorful cargo containers

A New Era of Economic Nationalism

The United States was founded on free trade and, at the same time, on high tariffs. Domestically, the United States had exceptional economic policy: there were hardly any taxes, the currency was reliably fixed to gold, and trade was free between the states. With what I've called "the Magic Formula" (low taxes and stable money), the US got richer, even with high tariffs on trade with the rest of the world. Read More ›
People praying to God at home on a black background

Springs Rescue Mission: Spiritual Recovery Through Love, Not Force

A Springs Rescue Mission (SRM) document declares, "Our faith is why we do what we do, but faith is never required of others to receive basic relief services. … We believe it is God's job to change people, not ours." Old-style missions often thought they could change people by requiring attendance at chapel services. SRM does not have a campus church or any required service. Read More ›
Will We Be Haunted by a Non-Hallucinatory AI?

Lloyd Watts discusses the significant challenge posed by hallucinations in large language models (LLMs), such as ChatGPT. While these models often generate fluent and useful responses, they occasionally produce incorrect or misleading information, referred to as "hallucinations." Watts highlights that this problem, acknowledged by major tech companies like Google, remains unsolved despite their advanced efforts. He argues that hallucinations are… Read More ›