Click on any of the titles to read the full piece!
This Randall O’Reilly study challenges traditional ideas about how the hippocampus, our brain's memory center, works. Rather than simply strengthening connections between neurons that fire together (Hebbian learning), the researchers found that error-driven learning leads to better memory formation. Their new model, called Theremin, showed improved memory capacity and explained why testing ourselves helps us learn better than just reviewing information.
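To make the contrast concrete, here is a toy sketch (not the paper's actual Theremin model) comparing a plain Hebbian update with an error-driven delta-rule update on a single linear unit; the data, learning rate, and variable names are all illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: a single linear unit should learn to reproduce target outputs.
X = rng.normal(size=(200, 20))            # 200 input patterns, 20 features
w_true = rng.normal(size=20)
targets = X @ w_true                      # desired output for each pattern

eta = 0.005
w_hebb = rng.normal(scale=0.01, size=20)  # small random initial weights
w_delta = rng.normal(scale=0.01, size=20)

for _ in range(20):                       # a few passes over the data
    for xi, ti in zip(X, targets):
        # Hebbian rule: strengthen weights whenever input and output co-activate
        # (no notion of a target, so weights grow without bound unless normalized).
        w_hebb += eta * (xi @ w_hebb) * xi
        # Error-driven (delta) rule: change weights in proportion to the
        # prediction error, pushing the output toward the target.
        w_delta += eta * (ti - xi @ w_delta) * xi

print("Hebbian      MSE:", np.mean((X @ w_hebb - targets) ** 2))
print("Error-driven MSE:", np.mean((X @ w_delta - targets) ** 2))
```

Running this, the Hebbian weights drift away from the targets while the error-driven weights converge, which is the basic intuition behind the paper's argument for error correction in memory formation.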
In this USC study, researchers developed a new AI system called "Sentimental Agents," in which multiple AI agents work together to make decisions, weighing both logic and emotions. Testing it in a simulated job-recruitment scenario showed that the agents' modeled emotional states influenced their choices. The study shows promise for using emotional AI in real-world group decision-making situations like hiring or medical diagnostics.
This UW Seattle study looks at how machine learning can help us discover and solve partial differential equations (PDEs), which are vital for understanding physical phenomena like fluid flow and heat transfer. It focuses on discovering new PDEs directly from data, simplifying complex systems for more straightforward analysis, and improving how we compute solutions. This approach combines traditional science with advanced computing to uncover new insights in biology, engineering, and more.
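For a flavor of how a PDE can be discovered from data, the sketch below uses the sparse-regression idea (in the spirit of methods like SINDy/PDE-FIND, not necessarily this study's exact pipeline) to recover the 1-D heat equation from simulated snapshots; the grid sizes, candidate library, and threshold are illustrative choices.

```python
import numpy as np

# Simulate the 1-D heat equation u_t = nu * u_xx with explicit finite differences.
nu = 0.1
nx, nt = 128, 400
dx, dt = 1.0 / nx, 1e-4
x = np.linspace(0.0, 1.0, nx, endpoint=False)
u = np.exp(-100.0 * (x - 0.5) ** 2)        # initial Gaussian bump
snapshots = [u.copy()]
for _ in range(nt):
    u_xx = (np.roll(u, -1) - 2.0 * u + np.roll(u, 1)) / dx**2   # periodic boundary
    u = u + dt * nu * u_xx
    snapshots.append(u.copy())
U = np.array(snapshots)                    # shape (nt + 1, nx)

# Candidate library of terms the unknown PDE might contain.
U_t = np.gradient(U, dt, axis=0).ravel()
U_x = np.gradient(U, dx, axis=1)
U_xx = np.gradient(U_x, dx, axis=1)
library = np.column_stack([U.ravel(), U_x.ravel(), U_xx.ravel(), (U * U_x).ravel()])
names = ["u", "u_x", "u_xx", "u*u_x"]

# Sequentially thresholded least squares: regress u_t on the library,
# zero out small coefficients, and refit on the surviving terms.
coef, *_ = np.linalg.lstsq(library, U_t, rcond=None)
for _ in range(10):
    small = np.abs(coef) < 1e-2
    coef[small] = 0.0
    if (~small).any():
        coef[~small], *_ = np.linalg.lstsq(library[:, ~small], U_t, rcond=None)

terms = " + ".join(f"{c:.3f}*{n}" for c, n in zip(coef, names) if c != 0.0)
print("Recovered PDE:  u_t =", terms)
```

On this noise-free toy data the regression keeps only the u_xx term with a coefficient close to 0.1, i.e. it rediscovers the heat equation used to generate the data.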
This Oxford study examines the use of machine learning (ML) to improve drug discovery, especially for small-molecule therapies. It highlights ML's potential to speed up this process but points out challenges, mainly due to limited and biased data. The researchers emphasize the importance of more comprehensive datasets and better evaluation methods to accurately measure ML's effectiveness in real-world scenarios, aiming for more reliable drug development practices.
This Oak Ridge National Laboratory study suggests that neuromorphic computing, which aims to mimic how the brain processes information, may be the next big thing. It has become increasingly important as traditional computing methods approach their physical limits. Neuromorphic computing is already used in tasks like object and keyword recognition, with chips such as IBM's TrueNorth and Intel's Loihi already built.
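Neuromorphic chips like TrueNorth and Loihi compute with spiking neurons rather than dense matrix multiplies. The toy simulation below of a single leaky integrate-and-fire neuron illustrates that event-driven style of computation; the parameters are illustrative and not tied to either chip's actual programming model.

```python
import numpy as np

# Minimal leaky integrate-and-fire (LIF) neuron, the kind of unit that
# spiking chips implement in silicon. All parameters here are illustrative.
dt = 1.0                     # time step (ms)
tau = 20.0                   # membrane time constant (ms)
v_rest, v_thresh, v_reset = 0.0, 1.0, 0.0

rng = np.random.default_rng(0)
drive = rng.uniform(0.0, 0.12, size=200)   # random input current per step

v = v_rest
spike_times = []
for t, i_in in enumerate(drive):
    # The membrane potential leaks toward rest and integrates the input.
    v += (dt / tau) * (v_rest - v) + i_in
    if v >= v_thresh:        # crossing the threshold emits a spike ...
        spike_times.append(t)
        v = v_reset          # ... and the potential resets

print(f"{len(spike_times)} spikes at steps {spike_times}")
```

Because neurons only communicate when they spike, hardware built around this model can stay idle most of the time, which is where the energy savings over conventional processors come from.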
Want to submit a piece? Or trying to write a piece and struggling? Check out the guides here!
Thank you for reading. Reminder: Byte Sized is open to everyone! Feel free to submit your piece. Please read the guides first though.
Please send all submissions to berkan@usc.edu as a Word document, with the subject line “Byte Sized Submission.” Thank you!