The future of urban housing is energy-efficient refrigerators

An exploded model of a retrofit


Katerra’s all-encompassing vision of reforming the construction world, using billions of dollars in investment to build an entirely new production system from the ground up, showcased stereotypical Silicon Valley arrogance. It has also had a fraction of the impact of European models that seek to retrofit buildings using a simple, straightforward, and standard set of parts.

The company shared a common blind spot with many American technologists, according to Gerard McCaughey, a serial entrepreneur and founder of Century Homes, an Irish pioneer of off-site construction: it disregarded innovation pioneered overseas. While American construction favored wood-frame building on-site with readily available raw materials—picture a Ford pickup piled with two-by-fours pulling up to a lot—more space- and material-constrained builders in Asia and Europe have perfected prefab and modular techniques. Katerra ignored these examples, which slowly built up expertise by focusing on specific sectors one at a time. Instead, it tried to reinvent the wheel, bringing every facet of the complex construction process in-house and building too many different models at once, causing massive cost overruns.

“It’s not what you know or what you don’t know that catches you,” says McCaughey, who held talks with Katerra leaders. “There were things they were dead certain you needed to do, but [they were wrong]. Off-site isn’t a one-trick pony. You have to crawl before you can walk. The least experienced guy in my company knew more about off-site construction than their senior leadership.” 

1. R38 effective envelope
2. Glazing with a low solar heat gain coefficient
3. Low-emissivity interior shades
4. Ceiling fans to circulate air within units
5. Lightly tempered air delivered through a centralized ventilation system
6. Decentralized cooling “boost” through a variable air volume unit activated by in-suite controls
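To get a feel for what the first two items above buy you, here is a rough back-of-envelope sketch in Python. The areas, temperature difference, and solar flux are assumed values chosen for illustration; they are not figures from the article.

```python
# Rough, illustrative estimate of cooling load for one unit (imperial units).
# All inputs below are assumptions, not numbers from the article.

# Conductive heat gain through the envelope: Q = A * dT / R  (BTU/hr)
envelope_area_ft2 = 1200      # assumed exterior wall + roof area for one unit
delta_t_f = 20                # assumed indoor/outdoor temperature difference (deg F)
r_value = 38                  # "R38 effective envelope"
conductive_gain = envelope_area_ft2 * delta_t_f / r_value

# Solar gain through glazing: Q = A * SHGC * incident solar flux
window_area_ft2 = 100         # assumed glazing area
shgc = 0.25                   # assumed "low" solar heat gain coefficient
solar_flux = 200              # assumed incident solar flux (BTU/hr per ft^2)
solar_gain = window_area_ft2 * shgc * solar_flux

total = conductive_gain + solar_gain
print(f"Conductive gain: {conductive_gain:.0f} BTU/hr")
print(f"Solar gain:      {solar_gain:.0f} BTU/hr")
print(f"Total:           {total:.0f} BTU/hr (~{total / 3412:.1f} kW)")
```

Under these assumed numbers, a high R-value envelope and low-SHGC glazing keep the peak load small enough that the modest measures lower on the list (shades, ceiling fans, a small ventilation "boost") can plausibly handle the rest.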

Many efforts are underway to decarbonize buildings. An example is the Holistic Energy and Architectural Retrofit Toolkit (HEART), a cloud-based computing platform that includes decision-making and energy management features.


The Energiesprong model, which has retrofitted thousands of homes in the Netherlands and across Europe, relies on Stroomversnelling (the name means “rapid acceleration”), a network in which contractors, housing associations, parts suppliers, and even financiers work in close contact—a level of coordination that even Katerra’s sprawling system didn’t match. Right now, the Energiesprong system can redo a building in roughly 10 days. Other startups and construction companies offer complementary upgrades: Dutch firm Factory Zero, for example, makes prebuilt modules for roofs boasting electric boilers, heat pumps, and solar hookups. The greening of an older building is nearly plug-and-play.

It’s part of a larger European model that starts with an ambitious emissions policy and supports it with incentives and funding for retrofits and new buildings via programs like Horizon Europe, in effect subsidizing novel building methods and creating a market for innovative windows, doors, and HVAC systems. A key component of its success has been governments’ willingness to fund such upgrades for subsidized and public housing, typically postwar towers and townhomes in desperate need of improvement. But there are also other significant advantages in Europe: building codes are much more standardized across countries and the continent as a whole, including some progressive regulations pushing for the passivhaus standard, an ultra-efficient level of insulation and ventilation that drastically reduces the energy needed for heating and cooling. The entire housing ecosystem is smaller and more standardized too, making it easier to support more experiments. Energiesprong uses a single building model, a handful of contractors, and a relatively small pool of players across a small area. 

Coordination would be exponentially harder in a single US city, much less the entire nation. “Europe takes a shotgun approach and funds numerous programs across the board,” says Michael Eliason, a Seattle-based sustainable building expert and founder of Larch Lab, a design studio and think tank. It’s an approach that spreads risk among different ideas, as opposed to concentrating venture capital on a handful of single-minded hypergrowth startups. “The US ends up being kind of a sniper rifle,” he says. “Katerra fails and it impacts the entire prefab construction industry.” 

An emerging model in Canada seeks to replicate Europe’s. CityHousing Hamilton, the municipal housing authority for the Ontario city, recently used national housing funds for a full retrofit of Ken Soble Tower, a waterfront high-rise for seniors that was built in 1967 and had fallen into disrepair. The project, which incorporated panelized exterior cladding, new high-efficiency windows, and electrification of heating and gas stoves, brought the building to the passivhaus standard; with a 94% reduction in energy usage thanks to extreme efficiency, the total energy needed to cool and heat a unit is equivalent to three incandescent light bulbs. Gracious new bay windows offering seating, sweeping views, and daylight suggest there was no aesthetic price to pay.
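As a rough check on that light-bulb comparison, here is the arithmetic in Python. The 60 W bulb rating, and the assumption that the 94% reduction applies specifically to heating and cooling energy, are mine rather than figures stated in the article.

```python
# Illustrative arithmetic only. The 60 W bulb rating and the scope of the
# 94% reduction are assumptions, not figures from the article.

bulb_watts = 60                       # assumed rating of one incandescent bulb
bulbs = 3                             # "equivalent to three incandescent light bulbs"
load_watts = bulbs * bulb_watts       # ~180 W of continuous heating/cooling power

hours_per_year = 8760
annual_kwh = load_watts * hours_per_year / 1000
print(f"Post-retrofit heating/cooling: ~{load_watts} W, ~{annual_kwh:.0f} kWh per unit per year")

# If that figure reflects a 94% reduction, the pre-retrofit load would have been
# roughly 1 / (1 - 0.94), or about 17 times, larger.
print(f"Implied pre-retrofit load: ~{annual_kwh / (1 - 0.94):.0f} kWh per unit per year")
```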

Graeme Stewart of ERA Architects, who led the project and has studied the nation’s hundreds of similar midcentury high-rises, says the project gave business to Canadian firms manufacturing high-tech windows and cladding, suggesting that such work could help seed a domestic industry for more green building projects. He’s even spearheaded the creation of the Tower Renewal Partnership, an organization dedicated to pursuing similar retrofits across Canada. But CityHousing Hamilton’s development manager, Sean Botham, says that even with all the benefits they’re seeing for the tower residents—better air quality, infection control, mental health, and cognitive function, and “views you just don’t get in social housing”—the agency isn’t likely to pay the 8% cost premium to upgrade other buildings in its portfolio without more funding support.

Deep learning can almost perfectly predict how ice forms


Researchers have used deep learning techniques to model how ice crystals form in the atmosphere with much higher precision than ever before. Their paper, published this week in PNAS, hints at the potential for the new method to significantly increase the accuracy of weather and climate forecasting.

The researchers used deep learning to predict how atoms and molecules behave. First, deep learning models were trained on small-scale simulations of 64 water molecules to help them predict how electrons in atoms interact. The models then replicated those interactions on a larger scale, with more atoms and molecules. It’s this ability to precisely simulate electron interactions that allowed the team to accurately predict physical and chemical behavior. 
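The paper’s own machinery is more involved, but the core idea described above—learn per-atom energies from small reference simulations, then sum them over much larger systems—can be sketched in a few lines of PyTorch. The descriptor dimensions, network sizes, and randomly generated training data below are placeholders for illustration, not the authors’ actual method or dataset.

```python
# Minimal sketch of a machine-learned interatomic potential: a small network
# maps each atom's local-environment descriptor to a per-atom energy, the
# per-atom energies are summed into a total, and the network is trained on
# reference energies from small simulations. Data here are random placeholders.
import torch
import torch.nn as nn

class PerAtomEnergyNet(nn.Module):
    def __init__(self, descriptor_dim: int = 32):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(descriptor_dim, 64), nn.SiLU(),
            nn.Linear(64, 64), nn.SiLU(),
            nn.Linear(64, 1),
        )

    def forward(self, descriptors: torch.Tensor) -> torch.Tensor:
        # descriptors: (n_atoms, descriptor_dim) -> scalar total energy
        per_atom_energy = self.mlp(descriptors)   # (n_atoms, 1)
        return per_atom_energy.sum()              # size-extensive total energy

# Placeholder "training set": descriptors and total energies for small cells
# (64 water molecules = 192 atoms per configuration).
n_configs, n_atoms, dim = 100, 192, 32
train_x = torch.randn(n_configs, n_atoms, dim)
train_e = torch.randn(n_configs)

model = PerAtomEnergyNet(dim)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for epoch in range(10):
    for x, e_ref in zip(train_x, train_e):
        opt.zero_grad()
        loss = (model(x) - e_ref) ** 2
        loss.backward()
        opt.step()

# Because the total energy is a sum of per-atom terms, the same trained network
# can be evaluated on configurations far larger than those it was trained on.
big_system = torch.randn(10_000, dim)             # descriptors for a larger box
with torch.no_grad():
    print(model(big_system))
```

The design choice that makes the scale-up possible is the per-atom decomposition: the network only ever sees local environments, so the cost grows roughly linearly with the number of atoms instead of with the cost of an explicit electronic-structure calculation.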

“The properties of matter emerge from how electrons behave,” says Pablo Piaggi, a research fellow at Princeton University and the lead author on the study. “Simulating explicitly what happens at that level is a way to capture much more rich physical phenomena.”

It’s the first time this method has been used to model something as complex as the formation of ice crystals, also known as ice nucleation. This development may eventually improve the accuracy of weather and climate forecasting, because the formation of ice crystals is one of the first steps in the formation of clouds, which are where all precipitation comes from.

Xiaohong Liu, professor of atmospheric sciences at Texas A&M University, who was not involved in the study, says half of all precipitation events—whether it’s snow or rain or sleet—begin as ice crystals, which then grow larger and result in precipitation. If researchers can model ice nucleation more accurately, it could give a big boost to weather prediction overall.

Ice nucleation is currently predicted based on laboratory experiments. Researchers collect data on ice formation under different laboratory conditions, and that data is fed into weather prediction models under similar real-world conditions. This method works well enough sometimes, but often ends up being inaccurate because of the sheer number of variables in real-world conditions. If even a few factors vary between the lab and actual conditions, the results can be quite different.

“Your data is only valid for a certain region, temperature, or kind of laboratory setting,” Liu says.
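A toy illustration of the lab-fit workflow Liu describes: fit a simple curve to hypothetical laboratory freezing measurements, then query it under the conditions a weather model encounters. The numbers below are invented; the point is only that such a fit is trustworthy inside the measured range and risky outside it.

```python
# Illustrative sketch of an empirical lab-based parameterization.
# All data points below are made up.
import numpy as np

# Hypothetical lab data: fraction of droplets frozen at various supercoolings (deg C below 0).
supercooling_c = np.array([10.0, 12.0, 15.0, 18.0, 20.0])
frozen_fraction = np.array([0.01, 0.03, 0.10, 0.30, 0.55])

# Fit an exponential-style relationship: log(fraction) roughly linear in supercooling.
slope, intercept = np.polyfit(supercooling_c, np.log(frozen_fraction), deg=1)

def predicted_frozen_fraction(supercooling: float) -> float:
    """Empirical fit; only trustworthy inside the measured 10-20 deg C range."""
    return float(np.exp(intercept + slope * supercooling))

print(predicted_frozen_fraction(16.0))   # interpolation: reasonably safe
print(predicted_frozen_fraction(30.0))   # extrapolation: can be badly wrong (even > 1),
                                         # which is the weakness the article describes
```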

Basing ice nucleation on how electrons interact is much more precise, but it’s also extremely computationally expensive. Predicting ice nucleation requires researchers to model at least 4,000 to 100,000 water molecules, which even on supercomputers could take years to run. And even that would only be able to model the interactions for 100 picoseconds, or 10⁻¹⁰ seconds, not enough to observe the ice nucleation process.

Using deep learning, however, researchers were able to run the calculations in just 10 days. The simulated time span was also 1,000 times longer, about 100 nanoseconds—still a fraction of a second, but just enough to see the ice nucleation process.

Of course, more accurate ice nucleation models alone won’t make weather forecasting perfect, says Liu. Ice nucleation is a small but critical component of weather modeling. Other aspects, like understanding how water droplets and ice crystals grow, and how they move and interact under different conditions, are also important.

Still, the ability to more accurately model how ice crystals form in the atmosphere would significantly improve weather predictions, especially about whether it’s likely to rain or snow, and by how much. It could also improve climate forecasting by improving the ability to model clouds, which are vital players in the absorption of sunlight and the abundance of greenhouse gases.

Piaggi says future research could model ice nucleation when there are substances like smoke in the air, which can improve the accuracy of models even more. Because of deep learning techniques, it’s now possible to use electron interactions to model larger systems for longer periods of time.

“That has opened essentially a new field,” Piaggi says. “It’s already having and will have an even greater role in simulations in chemistry and in our simulations of materials.”

How to craft effective AI policy


Nicol Turner Lee: So to your first question, I think you’re right that policymakers should actually define the guardrails, but I don’t think they need to do it for everything. I think we need to pick those areas that are most sensitive. The EU has called them high risk. And maybe we might take from that some models that help us think about what’s high risk, where we should spend more time, and potentially, policymakers, where should we spend time together?

I’m a huge fan of regulatory sandboxes when it comes to co-design and co-evolution of feedback. Uh, I have an article coming out in an Oxford University Press book on an incentive-based rating system that I could talk about in just a moment. But I also think, on the flip side, that all of you have to account for your reputational risk.

As we move into a much more digitally advanced society, it is incumbent upon developers to do their due diligence too. You can’t afford, as a company, to go out and put out an algorithm or an autonomous system that you think is the best idea, and then end up on the front page of the newspaper. Because what that does is degrade your consumers’ trust in your product.

And so what I tell, you know, both sides is that I think it’s worth a conversation where we have certain guardrails when it comes to facial recognition technology, because we don’t have the technical accuracy when it applies to all populations. When it comes to disparate impact on financial products and services, there are great models that I’ve found in my work in the banking industry, where they actually have triggers because they have regulatory bodies that help them understand what proxies actually deliver disparate impact. We just saw this in the housing and appraisal market, where AI is being used to sort of, um, replace subjective decision making but is contributing more to the type of discrimination and predatory appraisals that we see. There are certain cases where we actually need policymakers to impose guardrails, but more so to be proactive. I tell policymakers all the time, you can’t blame data scientists if the data is horrible.

Anthony Green: Right.

Nicol Turner Lee: Put more money in R&D. Help us create better data sets that are overrepresented in certain areas or underrepresented in terms of minority populations. The key thing is, it has to work together. I don’t think that we’ll have a good winning solution if policymakers actually, you know, lead this, or data scientists lead it by themselves in certain areas. I think you really need people working together and collaborating on what those principles are. We create these models. Computers don’t. We know what we’re doing with these models when we’re creating algorithms or autonomous systems or ad targeting. We know! We in this room, we cannot sit back and say we don’t understand why we use these technologies. We know, because they actually have a precedent for how they’ve been expanded in our society, but we need some accountability. And that’s really what I’m trying to get at. Who’s holding us accountable for these systems that we’re creating?

It’s so interesting, Anthony, these last few, uh, weeks, as many of us have watched the, uh, conflict in Ukraine. My daughter, because I have a 15-year-old, has come to me with a variety of TikToks and other things that she’s seen, to sort of say, “Hey mom, did you know that this is happening?” And I’ve had to sort of pull myself back, because I’ve gotten really involved in the conversation, not knowing that in some ways, once I go down that path with her, I’m going deeper and deeper and deeper into that well.

Anthony Green: Yeah.

A bioengineered cornea can restore sight to blind people


One unexpected bonus was that the implant changed the shape of the cornea enough for its recipients to wear contact lenses for the best possible vision, even though they had been previously unable to tolerate them.

The cornea helps focus light rays on the retina at the back of the eye and protects the eye from dirt and germs. When damaged by infection or injury, it can prevent light from reaching the retina, making it difficult to see.

Corneal blindness is a big problem: around 12.7 million people are estimated to be affected by the condition, and cases are rising at a rate of around a million each year. Iran, India, China, and various countries in Africa have particularly high levels of corneal blindness, and specifically keratoconus.

Because pig skin is a by-product of the food industry, using this bioengineered implant should cost a fraction as much as transplanting a human donor cornea, said Neil Lagali, a professor in the Department of Biomedical and Clinical Sciences at Linköping University and one of the researchers behind the study.

“It will be affordable, even to people in low-income countries,” he said. “There’s a much bigger cost saving compared to the way traditional corneal transplantation is being done today.”

The team is hoping to run a larger clinical trial of at least 100 patients in Europe and the US. In the meantime, they plan to kick-start the regulatory process required for the US Food and Drug Administration to eventually approve the device for the market.
