Feed aggregator

Maine scrambles as storm damage outpaces climate planning

ClimateWire News - Wed, 01/24/2024 - 6:20am
Recent storms have flooded inland communities, left hundreds of thousands without power, and damaged roads and piers.

Brazilian meat giant with mega emissions races to attract US investors

ClimateWire News - Wed, 01/24/2024 - 6:20am
JBS wants to join the New York Stock Exchange. The move could lead to rapid deforestation, critics say.

New tool predicts flood risk from hurricanes in a warming climate

MIT Latest News - Wed, 01/24/2024 - 6:00am

Coastal cities and communities will face more frequent major hurricanes as the climate warms in the coming years. To help coastal cities prepare for future storms, MIT scientists have developed a method to predict how much flooding a coastal community is likely to experience as hurricanes evolve over the next decades.

When hurricanes make landfall, strong winds whip up salty ocean waters that generate storm surge in coastal regions. As the storms move over land, torrential rainfall can induce further flooding inland. When multiple flood sources such as storm surge and rainfall interact, they can compound a hurricane’s hazards, leading to significantly more flooding than would result from any one source alone. The new study introduces a physics-based method for predicting how the risk of such complex, compound flooding may evolve under a warming climate in coastal cities.

One example of compound flooding’s impact is the aftermath from Hurricane Sandy in 2012. The storm made landfall on the East Coast of the United States as heavy winds whipped up a towering storm surge that combined with rainfall-driven flooding in some areas to cause historic and devastating floods across New York and New Jersey.

In their study, the MIT team applied the new compound flood-modeling method to New York City to predict how climate change may influence the risk of compound flooding from Sandy-like hurricanes over the next decades.  

They found that, in today’s climate, a Sandy-level compound flooding event will likely hit New York City once every 150 years on average. By midcentury, a warmer climate will drive up the frequency of such flooding, to once every 60 years. By the end of the century, destructive Sandy-like floods will deluge the city every 30 years — a fivefold increase compared to the present climate.
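As a back-of-envelope check, those return periods can be translated into multi-decade odds. The probability arithmetic below is standard and not from the paper; it assumes each year is independent:

```python
# Convert a return period into the chance of seeing at least one such flood
# over a planning horizon, assuming independent years.

def exceedance_prob(return_period_years: float, horizon_years: int) -> float:
    """Probability of at least one event within the horizon."""
    p_annual = 1.0 / return_period_years
    return 1.0 - (1.0 - p_annual) ** horizon_years

for label, rp in [("today", 150), ("midcentury", 60), ("end of century", 30)]:
    print(f"{label:>15}: annual p = {1 / rp:.3f}, "
          f"chance over 30 years = {exceedance_prob(rp, 30):.0%}")
```

Under the study’s reported return periods, the chance of at least one Sandy-level flood within a 30-year span rises from roughly 18 percent today to about 64 percent by century’s end.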

“Long-term average damages from weather hazards are usually dominated by the rare, intense events like Hurricane Sandy,” says study co-author Kerry Emanuel, professor emeritus of atmospheric science at MIT. “It is important to get these right.”

While these are sobering projections, the researchers hope the flood forecasts can help city planners prepare and protect against future disasters. “Our methodology equips coastal city authorities and policymakers with essential tools to conduct compound flooding risk assessments from hurricanes in coastal cities at a detailed, granular level, extending to each street or building, in both current and future decades,” says study author Ali Sarhadi, a postdoc in MIT’s Department of Earth, Atmospheric and Planetary Sciences.

The team’s open-access study appears online today in the Bulletin of the American Meteorological Society. Co-authors include Raphaël Rousseau-Rizzi at MIT’s Lorenz Center, Kyle Mandli at Columbia University, Jeffrey Neal at the University of Bristol, Michael Wiper at the Charles III University of Madrid, and Monika Feldmann at the Swiss Federal Institute of Technology Lausanne.

The seeds of floods

To forecast a region’s flood risk, weather modelers typically look to the past. Historical records contain measurements of previous hurricanes’ wind speeds, rainfall, and spatial extent, which scientists use to predict where and how much flooding may occur with coming storms. But Sarhadi believes these historical records are too limited and too brief to serve as a reliable guide to future hurricanes’ risks.

“Even if we had lengthy historical records, they wouldn’t be a good guide for future risks because of climate change,” he says. “Climate change is changing the structural characteristics, frequency, intensity, and movement of hurricanes, and we cannot rely on the past.”

Sarhadi and his colleagues instead looked to predict a region’s risk of hurricane flooding in a changing climate using a physics-based risk assessment methodology. They first paired simulations of hurricane activity with coupled ocean and atmospheric models over time. With the hurricane simulations, developed originally by Emanuel, the researchers virtually scatter tens of thousands of “seeds” of hurricanes into a simulated climate. Most seeds dissipate, while a few grow into category-level storms, depending on the conditions of the ocean and atmosphere.

When the team drives these hurricane simulations with climate models of ocean and atmospheric conditions under certain global temperature projections, they can see how hurricanes change, for instance in terms of intensity, frequency, and size, under past, current, and future climate conditions.

The team then sought to precisely predict the level and degree of compound flooding from future hurricanes in coastal cities. The researchers first used rainfall models to simulate rain intensity for a large number of simulated hurricanes, then applied numerical models to hydraulically translate that rainfall intensity into flooding on the ground as hurricanes make landfall, given information about a region such as its surface and topography. They also simulated the same hurricanes’ storm surges, using hydrodynamic models to translate each hurricane’s maximum wind speed and sea level pressure into surge height in coastal areas. The simulations further assessed the propagation of ocean waters into coastal areas, causing coastal flooding.

Then, the team developed a numerical hydrodynamic model to predict how the two sources of hurricane-induced flooding, storm surge and rain-driven flooding, would simultaneously interact through time and space as simulated hurricanes make landfall in coastal regions such as New York City, in both current and future climates.

“There’s a complex, nonlinear hydrodynamic interaction between saltwater surge-driven flooding and freshwater rainfall-driven flooding, that forms compound flooding that a lot of existing methods ignore,” Sarhadi says. “As a result, they underestimate the risk of compound flooding.”
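A toy illustration of why ignoring that interaction understates the hazard (the numbers and drain mechanism below are invented for illustration; this is not the study’s hydrodynamic model): if surge at the outlet overtops the storm drains, rain runoff that would normally drain away instead pools on top of the surge.

```python
# Toy compound-flooding illustration (not the study's model). Depths in meters.

DRAIN_CAPACITY = 0.3  # runoff depth the drains can carry away -- assumed value

def naive_depth(surge: float, runoff: float) -> float:
    # Independent treatment: flood depth is whichever single source is worse,
    # with drains assumed to work normally on the rain runoff.
    return max(surge, runoff - DRAIN_CAPACITY)

def compound_depth(surge: float, runoff: float) -> float:
    # If surge overtops the drains, runoff cannot drain and stacks on the surge.
    if surge > DRAIN_CAPACITY:
        return surge + runoff
    return naive_depth(surge, runoff)

surge, runoff = 1.2, 0.5
print(naive_depth(surge, runoff))     # independent sources: 1.2 m
print(compound_depth(surge, runoff))  # interacting sources: 1.7 m
```

Even in this cartoon, the interacting estimate is half a meter deeper than either source alone would suggest, which is the qualitative point the full nonlinear model captures.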

Amplified risk

With their flood-forecasting method in place, the team applied it to a specific test case: New York City. They used the multipronged method to predict the city’s risk of compound flooding from hurricanes, and more specifically from Sandy-like hurricanes, in present and future climates. Their simulations showed that the city’s odds of experiencing Sandy-like flooding will increase significantly over the next decades as the climate warms, from once every 150 years in the current climate, to every 60 years by 2050, and every 30 years by 2099.

Interestingly, they found that much of this increase in risk has less to do with how hurricanes themselves will change in a warming climate than with how sea levels will rise around the world.

“In future decades, we will experience sea level rise in coastal areas, and we also incorporated that effect into our models to see how much that would increase the risk of compound flooding,” Sarhadi explains. “And in fact, we see sea level rise is playing a major role in amplifying the risk of compound flooding from hurricanes in New York City.”

The team’s methodology can be applied to any coastal city to assess the risk of compound flooding from hurricanes and extratropical storms. With this approach, Sarhadi hopes decision-makers can make informed decisions regarding the implementation of adaptive measures, such as reinforcing coastal defenses to enhance infrastructure and community resilience.

“Another aspect highlighting the urgency of our research is the projected 25 percent increase in coastal populations by midcentury, leading to heightened exposure to damaging storms,” Sarhadi says. “Additionally, we have trillions of dollars in assets situated in coastal flood-prone areas, necessitating proactive strategies to reduce damages from compound flooding from hurricanes under a warming climate.”

This research was supported, in part, by Homesite Insurance.

New model predicts how shoe properties affect a runner’s performance

MIT Latest News - Wed, 01/24/2024 - 12:00am

A good shoe can make a huge difference for runners, from career marathoners to couch-to-5K first-timers. But every runner is unique, and a shoe that works for one might trip up another. Outside of trying on a rack of different designs, there’s no quick and easy way to know which shoe best suits a person’s particular running style.

MIT engineers are hoping to change that with a new model that predicts how certain shoe properties will affect a runner’s performance.

The simple model incorporates a person’s height, weight, and other general dimensions, along with shoe properties such as stiffness and springiness along the midsole. With this input, the model then simulates a person’s running gait, or how they would run, in a particular shoe.

Using the model, the researchers can simulate how a runner’s gait changes with different shoe types. They can then pick out the shoe that produces the best performance, which they define as the degree to which a runner’s expended energy is minimized.

While the model can accurately simulate changes in a runner’s gait when comparing two very different shoe types, it is less discerning when comparing relatively similar designs, including most commercially available running shoes. For this reason, the researchers envision the current model would be best used as a tool for shoe designers looking to push the boundaries of sneaker design.

“Shoe designers are starting to 3D print shoes, meaning they can now make them with a much wider range of properties than with just a regular slab of foam,” says Sarah Fay, a postdoc in MIT’s Sports Lab and the Institute for Data, Systems, and Society (IDSS). “Our model could help them design really novel shoes that are also high-performing.”

The team is planning to improve the model, in hopes that consumers can one day use a similar version to pick shoes that fit their personal running style.

“We’ve allowed for enough flexibility in the model that it can be used to design custom shoes and understand different individual behaviors,” Fay says. “Way down the road, we imagine that if you send us a video of yourself running, we could 3D print the shoe that’s right for you. That would be the moonshot.”

The new model is reported in a study appearing this month in the Journal of Biomechanical Engineering. The study is authored by Fay and Anette “Peko” Hosoi, professor of mechanical engineering at MIT.

Running, revamped

The team’s new model grew out of talks with collaborators in the sneaker industry, where designers have started to 3D print shoes at commercial scale. These designs incorporate 3D-printed midsoles that resemble intricate scaffolds, the geometry of which can be tailored to give a certain bounce or stiffness in specific locations across the sole.

“With 3D printing, designers can tune everything about the material response locally,” Hosoi says. “And they came to us and essentially said, ‘We can do all these things. What should we do?’”

“Part of the design problem is to predict what a runner will do when you put an entirely new shoe on them,” Fay adds. “You have to couple the dynamics of the runner with the properties of the shoe.”

Fay and Hosoi looked first to represent a runner’s dynamics using a simple model. They drew inspiration from Thomas McMahon, a leader in the study of biomechanics at Harvard University, who in the 1970s used a very simple “spring and damper” model to capture a runner’s essential gait mechanics. Using this gait model, he predicted how fast a person could run on various track types, from traditional concrete surfaces to more rubbery material. The model showed that runners should run faster on softer, bouncier tracks that supported a runner’s natural gait.

Though this may be unsurprising today, the insight was a revelation at the time, prompting Harvard to revamp its indoor track — a move quickly followed by a string of new track records, as runners found they could run much faster on the softer, springier surface.

“McMahon’s work showed that, even if we don’t model every single limb and muscle and component of the human body, we’re still able to create meaningful insights in terms of how we design for athletic performance,” Fay says.

Gait cost

Following McMahon’s lead, Fay and Hosoi developed a similar, simplified model of a runner’s dynamics. The model represents a runner as a center of mass, with a hip that can rotate and a leg that can stretch. The leg is connected to a box-like shoe, with springiness and shock absorption that can be tuned, both vertically and horizontally.

They reasoned that they should be able to input into the model a person’s basic dimensions, such as their height, weight, and leg length, along with a shoe’s material properties, such as the stiffness of the front and back midsole, and use the model to simulate what a person’s gait is likely to be when running in that shoe.
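In the spirit of McMahon’s spring-and-damper idea, the stance phase of such a model can be sketched in a few lines. All parameter values below are illustrative assumptions, not figures from the paper:

```python
# Minimal spring-and-damper stance sketch: the leg is a spring (stiffness k)
# and damper (c) supporting the runner's mass. We integrate vertical motion
# from touchdown to takeoff with semi-implicit Euler steps.

m, g = 70.0, 9.81       # runner mass (kg), gravity (m/s^2)
k, c = 20000.0, 150.0   # leg stiffness (N/m), damping (N*s/m) -- assumed
dt = 1e-5               # time step (s)

y, v, t = 0.0, -1.0, 0.0  # vertical position (m), touchdown speed (m/s), time
for _ in range(int(1.0 / dt)):    # integrate at most one simulated second
    a = (-k * y - c * v) / m - g  # spring and damper push up; gravity pulls down
    v += a * dt
    y += v * dt
    t += dt
    if y >= 0.0 and v > 0.0:      # leg re-extended, moving upward: takeoff
        break

print(f"ground contact time: {t * 1000:.0f} ms")
```

With these assumed parameters the sketch yields a contact time in the ballpark of real running stance phases (a few hundred milliseconds), and varying k and c mimics the kind of stiffness and springiness changes the full model explores shoe by shoe.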

But they also realized that a person’s gait can depend on a less definable property, which they call the “biological cost function” — a quality that a runner might not consciously be aware of but nevertheless may try to minimize whenever they run. The team reasoned that if they can identify a biological cost function that is general to most runners, then they might predict not only a person’s gait for a given shoe but also which shoe produces the gait corresponding to the best running performance.

With this in mind, the team looked to a previous treadmill study, which recorded detailed measurements of runners, such as the force of their impacts, the angle and motion of their joints, the spring in their steps, and the work of their muscles as they ran, each in the same type of running shoe.

Fay and Hosoi hypothesized that each runner’s actual gait arose not only from their personal dimensions and shoe properties, but also from a subconscious goal to minimize one or more as-yet-unknown biological measures. To reveal these measures, the team used their model to simulate each runner’s gait multiple times. Each time, they programmed the model to assume the runner minimized a different biological cost, such as the degree to which they swing their leg or the impact that they make with the treadmill. They then compared the modeled gait with the runner’s actual gait to see which modeled gait — and assumed cost — matched the actual gait.

In the end, the team found that most runners tend to minimize two costs: the impact their feet make with the treadmill and the amount of energy their legs expend.

“If we tell our model, ‘Optimize your gait on these two things,’ it gives us really realistic-looking gaits that best match the data we have,” Fay explains. “This gives us confidence that the model can predict how people will actually run, even if we change their shoe.”
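That identification loop can be caricatured in a few lines. The cost functions and numbers below are invented for illustration; the study’s actual candidates were biomechanical quantities such as impact force and leg work, evaluated with the full gait model:

```python
# Toy version of cost identification: for each candidate biological cost, find
# the gait (here reduced to a single step frequency) that minimizes it, then
# keep the candidate whose optimal gait best matches the observed gait.

observed_step_freq = 2.9  # steps/s -- a made-up "measured" gait

candidate_costs = {                                  # all functions invented
    "impact":     lambda f: 1.0 / f + 0.02 * f,      # slow, heavy steps hurt
    "leg_energy": lambda f: 0.1 * (f - 3.0) ** 2,    # cheapest near f = 3
    "leg_swing":  lambda f: 0.5 * f,                 # fast swinging hurts
}

def optimal_freq(cost, lo=1.0, hi=5.0, n=4000):
    """Grid search for the step frequency that minimizes a cost."""
    grid = [lo + (hi - lo) * i / n for i in range(n + 1)]
    return min(grid, key=cost)

best = min(candidate_costs,
           key=lambda name: abs(optimal_freq(candidate_costs[name])
                                - observed_step_freq))
print(best)  # -> leg_energy: its optimal gait sits closest to the observed one
```

The real study ran this comparison against treadmill data and, as described above, found that a combination of impact and leg-energy costs reproduced runners’ actual gaits best.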

As a final step, the researchers simulated a wide range of shoe styles and used the model to predict a runner’s gait and how efficient each gait would be for a given type of shoe.

“In some ways, this gives you a quantitative way to design a shoe for a 10K versus a marathon shoe,” Hosoi says. “Designers have an intuitive sense for that. But now we have a mathematical understanding that we hope designers can use as a tool to kickstart new ideas.”

This research is supported, in part, by adidas.

Fragging: The Subscription Model Comes for Gamers

EFF: Updates - Tue, 01/23/2024 - 7:24pm

We're taking part in Copyright Week, a series of actions and discussions supporting key principles that should guide copyright policy. Every day this week, various groups are taking on different elements of copyright law and policy, addressing what's at stake and what we need to do to make sure that copyright promotes creativity and innovation.

The video game industry is undergoing the same concerning changes we’ve seen before with film and TV, and it underscores the need for meaningful digital ownership.

Twenty years ago you owned DVDs. Ten years ago you probably had a Netflix subscription with a seemingly endless library. Now, you probably have two to three subscription services, and regularly hear about shows and movies you can no longer access, either because they’ve moved to yet another subscription service, or because platforms are delisting them altogether.

The video game industry is getting the same treatment. While it is still common for people to purchase physical or digital copies of games, albeit often from within walled gardens like Steam or Epic Games, game subscriptions are becoming more and more common. Like the early days of movie streaming, services like Microsoft Game Pass or PlayStation Plus seem to offer a good deal. For a flat monthly fee, you have access to seemingly unlimited game choices. That is, for now.

In a recent announcement from game developer Ubisoft, their director of subscriptions said plainly that a goal of their subscription service’s rebranding is to get players “comfortable” with not owning their games. Notably, this is from a company that developed five non-mobile games last year, hoping users will access them and older games through a $17.99 per month subscription; that is, $215.88 per year. And after a year, how many games does the end user actually own? None.

This fragmentation of the video game subscription market isn’t just driven by greed; it also answers a real frustration among users that the industry itself created. Gamers at one point could easily buy and return games, rent titles they were merely curious about, and even recoup costs by reselling their games. With the proliferation of DRM and walled-garden game vendors, those ownership rights have been eroded. Reselling or giving away a copy of your game, or leaving it to your next of kin, is no longer permitted. The closest thing to a rental now available is a game demo (if it exists) or playing a game within the time frame necessary to get a refund (if a storefront offers one). These purchases are also put at risk when games ship incomplete and are not finished until after that refund window closes. Developers such as Ubisoft will also shut down online services, severely impairing these games’ features or even making them unplayable.

DRM and tightly controlled gaming platforms also make it harder to mod or tweak games in ways the platform doesn’t choose to support. Mods are a thriving medium for extending the functionalities, messages, and experiences facilitated by a base game, one where passion has driven contributors to design amazing things with a low barrier to entry. Mods depend on users who have the necessary access to a work to understand how to mod it and to deploy mods when running the program. A model wherein the player can only access these aspects of the game in the ways the manufacturer supports undermines the creative rights of owners as well.

This shift should raise alarms for both users and creators alike. With publishers serving as intermediaries, game developers are left either struggling to reach their audience, or settling for a fraction of the revenue they could receive from traditional sales. 

We need to preserve digital ownership before we see video games fall into the same cycles as film and TV, with users stuck paying more and receiving not robust ownership, but fragile access on the platform’s terms.

What to do about AI in health?

MIT Latest News - Tue, 01/23/2024 - 4:25pm

Before a drug is approved by the U.S. Food and Drug Administration (FDA), it must demonstrate both safety and efficacy. However, the FDA does not require an understanding of a drug’s mechanism of action for approval. This acceptance of results without explanation raises the question of whether the "black box" decision-making process of a safe and effective artificial intelligence model must be fully explained in order to secure FDA approval.

This topic was one of many discussion points addressed on Monday, Dec. 4 during the MIT Abdul Latif Jameel Clinic for Machine Learning in Health (Jameel Clinic) AI and Health Regulatory Policy Conference, which ignited a series of discussions and debates amongst faculty; regulators from the United States, EU, and Nigeria; and industry experts concerning the regulation of AI in health. 

As machine learning continues to evolve rapidly, uncertainty persists as to whether regulators can keep up and still reduce the likelihood of harmful impact while ensuring that their respective countries remain competitive in innovation. To promote an environment of frank and open discussion, attendance at the Jameel Clinic event was limited to a curated audience of 100, and conversations were held under the Chatham House Rule, allowing speakers to air controversial opinions and arguments without being identified as the source.

Rather than hosting an event to generate buzz around AI in health, the Jameel Clinic's goal was to create a space to keep regulators apprised of the most cutting-edge advancements in AI, while allowing faculty and industry experts to propose new or different approaches to regulatory frameworks for AI in health, especially for AI use in clinical settings and in drug development. 

AI’s role in medicine is more relevant than ever, as the industry struggles with a post-pandemic labor shortage, increased costs (“Not a salary issue, despite common belief,” said one speaker), as well as high rates of burnout and resignations among health care professionals. One speaker suggested that priorities for clinical AI deployment should be focused more on operational tooling rather than patient diagnosis and treatment. 

One attendee pointed out a “clear lack of education across all constituents — not just amongst developer communities and health care systems, but with patients and regulators as well.” Given that medical doctors are often the primary users of clinical AI tools, a number of the medical doctors present pleaded with regulators to consult them before taking action. 

Data availability was a key issue for the majority of AI researchers in attendance. They lamented the lack of data to make their AI tools work effectively. Many faced barriers such as intellectual property barring access or simply a dearth of large, high-quality datasets. “Developers can’t spend billions creating data, but the FDA can,” a speaker pointed out during the event. “There’s a price uncertainty that could lead to underinvestment in AI.” Speakers from the EU touted the development of a system obligating governments to make health data available for AI researchers. 

By the end of the daylong event, many attendees suggested continuing the discussion and praised the selective curation and closed environment, which created a unique space conducive to open and productive conversations on AI regulation in health. Once follow-up events are confirmed, the Jameel Clinic will develop additional workshops of a similar nature to maintain momentum and keep regulators in the loop on the latest developments in the field.

“The North Star for any regulatory system is safety,” acknowledged one attendee. “Generational thought stems from that, then works downstream.” 

Rowing in the right direction

MIT Latest News - Tue, 01/23/2024 - 4:10pm

For a college student, senior Tatum Wilhelm wakes up painfully early — at 5:15 a.m., to be exact. Five days per week, by 6:20 a.m. sharp, she is already rowing on the Charles River, bursting through the early morning fog. 

Between majoring in chemical engineering, minoring in anthropology, and working as an undergraduate student researcher at the Furst Lab, Wilhelm’s days are packed. But she says it’s her role on MIT Crew that gives her perspective on her goals and what matters most.  

Stretching her arms after a workout on the erg, the unforgiving indoor rowing machine used for individual training, she explains, “Crew is a set time in the day when I’m not thinking about academics. I’m just focused on pushing myself physically — and the river is beautiful.” 

She was captain of her team last year, but winning isn’t the current that pulls Wilhelm deeper and deeper into her sport; it’s teamwork. 

“When I first came here, I had the preconception that everyone at MIT was a genius and super into their books,” she says. “They are very smart, but everyone also does really cool stuff outside of academics. My favorite thing about this school is the people — especially my team.” 

Fitting in

A first-generation college student raised by a single mom, Wilhelm came to MIT from California with the support of Questbridge, a nonprofit that mentors high-achieving, low-income students as they apply early decision to their top-choice colleges. She was passionate about science and knew that MIT was the right place, but she didn’t know a soul on campus. 

It’s Wilhelm’s friendships, both in the lab and in the eight-person boat, that have given her a feeling of belonging. 

“Before I got to MIT, I honestly didn’t know what an engineer was,” she says bluntly. 

But once Wilhelm saw engineering alumni solving real-world problems in the field, she knew it was for her, ultimately choosing chemical engineering. 

When Covid-19 hit in the spring of her first year and MIT remained virtual for the fall 2020 semester, Wilhelm temporarily relocated to Alaska, where she worked as a farm hand and learned about sustainable agriculture. “I am an engineer — not a farmer. I am also not that outdoorsy, and that experience pushed me way out of my academic comfort zone in a great way,” Wilhelm says.

During that time, she began working remotely as an undergraduate researcher in the Furst Lab, logging on between shifts in the fields to meet with Assistant Professor Ariel Furst, who actively included her as one of the team from the start. 

Back in Cambridge as a sophomore, Wilhelm unexpectedly discovered a passion for anthropology when she signed up for class 21A.157 (The Meaning of Life), a seminar taught by William R. Kenan Jr. Professor of Anthropology Heather Paxson.

Wilhelm admits, “I thought the class would be too philosophical, but it was actually extremely applicable to things that were going on in students’ lives. It was about finding personal meaning in work, family, and money in tangible ways.” At the time, the whole world was still reeling from Covid-19, and being able to conduct that kind of soul-searching became a powerful tool. 

“I just kept going with the anthro courses and soon had collected enough for a minor,” Wilhelm says. “They complement my chemical engineering classes, which are very technical and centered around problem-solving.” 

Real-world chemical engineering

Wilhelm spent her junior year studying thermodynamics and fluid dynamics in the Department of Chemical Engineering (ChemE), as well as class 21A.520 (Magic, Science, and Religion), a seminar with professor of anthropology Graham Jones. The contrast both stretched and soothed her brain. She says Jones’s engaging style of teaching made him her favorite MIT professor.

This fall, Wilhelm took a class called 21A.301 (Disease and Health) with associate professor of anthropology Amy Moran-Thomas. Discussions about the biopharmaceutical industry and analyzing modes of care directly connected with her ChemE coursework and internships, and gave her perspective on how her future work can impact real-world users. She reflects, “Looking at how these treatments impact patients’ lives has provided a deeper understanding of the implications of my work. I value being able to look at very technical scientific problems from a humanities lens, and I think it has enhanced my learning in both disciplines.” 

Alongside her academic studies, Wilhelm has continued working at the Furst Lab, more recently with the support of MIT SuperUROP. The competitive program provides advanced undergraduates with independent research opportunities. 

With this funding, Wilhelm is conducting a project to examine how to potentially engineer cell-based electrochemical lanthanide sensors. Lanthanides are rare-earth elements used in several industries, including electronics and green energy, primarily for their magnetic and optical properties.

Wilhelm explains, “The current methods for the separation of lanthanides in mining and recycling are costly and environmentally damaging. This project aims to create an inexpensive and environmentally-friendly method for detecting and recovering lanthanides from complex solutions."

At MIT, she has noticed some interesting parallels between being part of the crew team and sharing the lab with researchers of different ages and backgrounds. In both settings, failing, iterating, and ultimately winning frame the culture. 

She says, “In the lab, there is an overarching sense of purpose, which also translates to crew. In rowing, we are all working together. We train both individually and as a team. Our performance as individuals matters, but we ultimately have to all come together to move the boat forward.” 

Next year, Wilhelm hopes to steer toward a PhD in chemical engineering or materials science.

“I’m really interested in the industry applications of ChemE, but in reality, I just want to continue researching and learning new things every day right now,” she says.

Professor Emeritus Peter Schiller, a pioneer researcher of the visual system, dies at 92

MIT Latest News - Tue, 01/23/2024 - 3:45pm

Peter Schiller, professor emeritus in the Department of Brain and Cognitive Sciences and a member of the MIT faculty since 1964, died on Dec. 23, 2023. He was 92.

Born in Berlin to Hungarian parents in 1931, Schiller and his family returned to Budapest in 1934, where they endured World War II; in 1947 he moved to the United States with his father and stepmother. Schiller attended college at Duke University, where he was on the soccer and tennis teams and received his bachelor’s degree in 1955. He then went on to earn his PhD with Morton Weiner at Clark University, where he studied cortical involvement in visual masking. In 1962, he came to what was then the Department of Psychology at MIT for postdoctoral research. Schiller was appointed an assistant professor in 1964 and full professor in 1971. He was appointed to the Dorothy Poitras Chair for Medical Physiology in 1986 and retired in 2013.

“Peter Schiller was a towering figure in the field of visual neurophysiology,” says Mriganka Sur, the Newton Professor of Neuroscience. “He was one of the pioneers of experimental studies in nonhuman primates, and his laboratory, together with those of Emilio Bizzi and Ann Graybiel, established MIT as a leading center of research in brain mechanisms of visual and motor function.”

Recalls John Maunsell, the Albert D. Lasker Distinguished Service Professor of Neurobiology at the University of Chicago, who did postdoctoral research with Schiller, “Peter was the boldest experimentalist I’ve ever known. Once he engaged with a question, he was unintimidated by how exacting, intricate, or extensive the required experiments might be. Over the years he produced an impressive range of results that others viewed as beyond reach.” 

Schiller’s former PhD student Michael Stryker, the W.F. Ganong Professor of Physiology at the University of California at San Francisco, writes, “Schiller was merciless in his criticism of weakly supported conclusions, whether by students or by major figures in the field. He demanded good data, real measurements, no matter how hard they were to make.”

Schiller’s research spanned multiple areas. As a graduate student, he designed an apparatus, the five-field tachistoscope, that rigorously controlled the timing and sequence of images shown to each eye in order to study visual masking and the generation of optical illusions. With it, Schiller demonstrated that several well-known optical illusions are generated in the cortex of the brain rather than by processes in the peripheral visual system.

Seeking postdoctoral research, he turned to his father’s friend, Hans-Lukas Teuber, who had just accepted an offer to be founding head of the Department of Psychology at MIT. Schiller learned how to make single-unit electrophysiological recordings from the brains of awake animals, which added a new dimension to his studies of the circuitry and mechanisms of cortical processing in the visual system. Among other findings, he saw that brightness masking in the visual system was caused by interactions among retinal neurons, in contrast to the cortical mechanism of illusions.

In 1964, Schiller was appointed assistant professor. Soon after, he embarked on productive collaborations with Emilio Bizzi, who had just arrived in the Department of Psychology. Schiller and Bizzi, who is now an Institute Professor Emeritus, shared an interest in the neural control of movement; they set to work on the oculomotor system and how it guides saccades, the rapid eye movements that center objects of interest in the visual field. They quantified the firing patterns of motor neurons that generate saccadic eye movements; paired with studies of the superior colliculus, the brain center that guides saccades in primates, and the frontal eye fields of the cortex, they outlined a fundamental scheme for the control of saccades, in which one system identifies targets in the visual scene and another generates eye movements to direct the gaze toward the target.

Continuing his dissection of visual circuitry, Schiller and his colleagues traced the connections that two different types of retinal cells, known as parasol cells and midget cells, send from the retina to the lateral geniculate nucleus of the thalamus. They discovered that each cell type connects to a different area, and that this physical segregation reflects a functional difference: Midget cells process color and fine texture while parasol cells carry motion and depth information. He then turned to the ON and OFF channels of the visual system, channels originating in different types of retinal neurons: some that respond to the onset of light, others that respond to the offset of light, and still others that respond to both. Building on earlier work by others, and inspired by recent discoveries of ways to pharmacologically isolate the ON and OFF systems, Schiller and several of his students extended the previous studies to primates and developed an explanation for the evolutionary benefit of what seems at first like a paradoxical system: the ON/OFF arrangement allows animals to perceive both increments and decrements in contrast and brightness more rapidly, a beneficial attribute if those shifts, for instance, represent the approach of a predator.

At the same time, the Schiller lab delved further into the role of various parts of the cortex in visual processing, especially the areas known as V4 and MT, later steps in visual processing pathways. Through single-neuron recordings and by making lesions in specific areas of the brain in the animals they studied, they revealed that area V4 has a major role in the selection of visual targets that are smaller or have lower contrast compared to other stimuli in a scene, an ability that, for example, helps an animal unmask a camouflaged predator or prey. Strikingly, he showed that many variations in images that are important for perception have a delayed influence on the responses of neurons in the primary visual cortex, indicating that they are produced by feedback from higher stages of visual processing.

Schiller’s many significant contributions to vision science were recognized with his election to the National Academy of Sciences and the American Academy of Arts and Sciences in 2007, and, in his home country, he was made an honorary member of the Magyar Tudományos Akadémia, the Hungarian Academy of Sciences, in 2008.

Schiller’s legacy is also evident in his students and trainees. Schiller counted more than 50 students and postdocs who passed through his lab in its 50 years. Four of his trainees have since been elected to the National Academy of Sciences: graduate students Larry Squire and Stryker, and postdocs Maunsell and Nikos Logothetis.

His mentorship also extended to faculty colleagues, recalls Picower Professor of Neuroscience Earl Miller: “He generously took me under his wing when I began at MIT, offering invaluable advice that steered me in the right direction. I will forever be grateful to him. His mentorship style was not coddling. It was direct and frank, just like Peter always was. I remember early in my nascent career when I was rattled by finding myself in a scientific disagreement with a senior investigator. Peter calmed me down, in his way. He said, ‘Don’t worry, controversy is great for a career.’ But he quickly added, ‘As long as you are right; otherwise, well ...’”

Schiller’s creative streak did not just influence his scientific thinking; he was an accomplished guitar and piano player, and he loved building complex and abstract sculptures, many of them constructed from angular pieces of colored glass. He is survived by his three children, David, Kyle, and Sarah, and five grandchildren. His wife, Ann Howell, died in 1999.

EFF and More Than 100 NGOs Set Non-Negotiable Redlines Ahead of UN Cybercrime Treaty Negotiations

EFF: Updates - Tue, 01/23/2024 - 9:44am

EFF today joined forces with 110 NGOs in a joint statement delivered to the United Nations Ad Hoc Committee, clearly outlining civil society’s non-negotiable redlines for the proposed UN Cybercrime Treaty and asserting that states should reject the treaty if these essential changes are not implemented.

The latest draft, published on November 6, 2023, does not adequately ensure adherence to human rights law and standards. Initially focused on cybercrime, the proposed treaty has alarmingly evolved into an expansive surveillance tool.

Katitza Rodriguez, EFF Policy Director for Global Privacy, asserts ahead of the upcoming concluding negotiations:

The proposed treaty needs more than just minor adjustments; it requires a more focused, narrowly defined approach to tackle cybercrime. This change is essential to prevent the treaty from becoming a global surveillance pact rather than a tool for effectively combating core cybercrimes. With its wide-reaching scope and invasive surveillance powers, the current version raises serious concerns about cross-border repression and potential police overreach. Above all, human rights must be the treaty's cornerstone, not an afterthought. If states can't unite on these key points, they must outright reject the treaty.

Historically, cybercrime legislation has been exploited to target journalists and security researchers, suppress dissent and whistleblowers, endanger human rights defenders, limit free expression, and justify unnecessary and disproportionate state surveillance measures. We are concerned that the proposed treaty, as it stands, will exacerbate these problems. The concluding session on the proposed treaty will be held at UN Headquarters in New York from January 29 to February 10. EFF will attend in person.

The joint statement specifically calls on states to narrow the scope of the criminalization provisions to well-defined cyber-dependent crimes; shield security researchers, whistleblowers, activists, and journalists from prosecution for their legitimate activities; explicitly include language on international human rights law, data protection, and gender mainstreaming; limit the scope of domestic criminal procedural measures and international cooperation to the core cybercrimes established in the criminalization chapter; and address concerns that the current draft could weaken cybersecurity and encryption. It also stresses the need for specific safeguards, such as the principles of prior judicial authorization, necessity, legitimate aim, and proportionality.

Side Channels Are Common

Schneier on Security - Tue, 01/23/2024 - 7:09am

Really interesting research: “Lend Me Your Ear: Passive Remote Physical Side Channels on PCs.”

Abstract:

We show that built-in sensors in commodity PCs, such as microphones, inadvertently capture electromagnetic side-channel leakage from ongoing computation. Moreover, this information is often conveyed by supposedly-benign channels such as audio recordings and common Voice-over-IP applications, even after lossy compression.

Thus, we show, it is possible to conduct physical side-channel attacks on computation by remote and purely passive analysis of commonly-shared channels. These attacks require neither physical proximity (which could be mitigated by distance and shielding), nor the ability to run code on the target or configure its hardware. Consequently, we argue, physical side channels on PCs can no longer be excluded from remote-attack threat models...

Benito the giraffe moved to central Mexico for better weather

ClimateWire News - Tue, 01/23/2024 - 6:47am
Environmental groups had voiced strong complaints about conditions faced by Benito at the city-run zoo in Ciudad Juarez, across from El Paso, Texas, where weather in the summer is brutally hot.

Winter storm batters UK, Ireland; thousands left without power

ClimateWire News - Tue, 01/23/2024 - 6:46am
Ireland and the U.K. have been hammered since fall by a series of gusty and wet storms that have knocked out power and caused flooding along river valleys.

Landslide in China buries 47 in freezing temperatures, snow

ClimateWire News - Tue, 01/23/2024 - 6:45am
The cause of the landslide wasn't immediately known as survivors and rescuers struggled with snow, icy roads and freezing temperatures that were forecast to persist for at least the next three days.

Q&A: California Senate budget chief on carbon removal, state deficit

ClimateWire News - Tue, 01/23/2024 - 6:45am
The state Senate budget subcommittee chair will play a crucial role as groups jockey for their climate policy priorities.

Delaware fights climate court loss against oil companies

ClimateWire News - Tue, 01/23/2024 - 6:44am
A judge earlier this month tossed out much of the state’s lawsuit seeking to hold BP, Chevron and other oil producers financially responsible for climate change.

Exxon sues to block shareholder climate proposal

ClimateWire News - Tue, 01/23/2024 - 6:43am
The energy company is asking a federal judge for permission to prevent shareholders from voting on a resolution that would make Exxon cut its carbon emissions faster.

EPA interviews nonprofits to run $14B 'green bank'

ClimateWire News - Tue, 01/23/2024 - 6:42am
Five nonprofits have applied to help disburse funds from one of the largest programs created in the Inflation Reduction Act.

Meet the 'conservative influencer' trying to upend Washington's cap-and-trade system

ClimateWire News - Tue, 01/23/2024 - 6:41am
Hedge fund executive Brian Heywood has bankrolled six ballot initiatives, including one to repeal the state's signature climate policy.

Author Correction: Dependence of economic impacts of climate change on anthropogenically directed pathways

Nature Climate Change - Tue, 01/23/2024 - 12:00am

Nature Climate Change, Published online: 23 January 2024; doi:10.1038/s41558-024-01930-6

Extinction of experience due to climate change

Nature Climate Change - Tue, 01/23/2024 - 12:00am

Nature Climate Change, Published online: 23 January 2024; doi:10.1038/s41558-023-01920-0

Ongoing climate change has the potential to reduce people’s direct experiences with nature, leading to or further exacerbating the ‘extinction of experience’. We argue that understanding these impacts is crucial, as the extinction of experience can have adverse consequences for both humans and the natural environment.
