Landmark College students create virtual reality game for statistics


Somehow Cael Hansen missed the series of virtual reality demos on campus. The Massachusetts native enrolled in Landmark College planning to major in computer science, and he decided to witness virtual reality in action for himself after hearing fellow students talk about it.

Inside the college’s innovation lab, where the virtual reality headset is housed, students can play around with a suite of high-tech gadgets, including a 3D printer, an eye tracker and infrared imaging technology. The equipment assists in groundbreaking research conducted in the college’s Institute for Research and Training, which is dedicated to improving teaching for students with learning disabilities.

The college exclusively serves students with learning and attention challenges such as autism or dyslexia. Students are strongly encouraged to work with technology, as the institution is determined to get more students with invisible disabilities working in science, technology, engineering, and mathematics (STEM) careers.

Dr. Ibrahim Dahlstrom-Hakki is a senior academic researcher who oversees the institute. His research centers on teaching STEM to struggling learners, and he is one of the masterminds behind several of the institute’s projects, which are often designed as hands-on collaborations with students. Past projects have involved students creating mobile apps and building computer programs.

When Hansen came to his office, Dahlstrom-Hakki became absorbed with the idea of developing a pilot statistics course for learners with disabilities using data visualization software. Earlier, Dahlstrom-Hakki and a colleague had tossed around ideas for a virtual reality project but nothing stuck until Hansen appeared. He immediately asked Hansen about designing a virtual reality game for statistics.

“It was something we sort of shelved and put on the back burner. When Cael came, I sort of ran this idea of teaching statistics through virtual reality by him and asked if he’d be interested in working on a project with us,” he said.

From that conversation, the virtual reality game Passage to Hunza was born. Five Landmark students, including Hansen, spent a year developing the game-based learning experience that does away with the specialized terms, symbols, and formulas of a typical statistics course and replaces them with a Pokemon-like first-person adventure that exercises statistical thinking.

“Those [terms] are major barriers to learning,” said the professor. “And if we can help [students] grasp the concept by interacting and experiencing rather than reading and listening to lectures, we felt that we could engage [many] more learners, and make it easier to convey the concepts to a broader range of learners.”

Inside the game, which can be instructive to all learners, students hunt down a variety of evil spirits with different attributes, power levels and abilities. In the process of capturing these oni (as they are called in this virtual world), players must distinguish and correlate the features that make a monster harder or easier to catch. By capturing each oni, students are sampling attributes and determining which attributes signal certain abilities.

“As you capture, you are essentially sampling. You start to develop an understanding of sampling from a population,” said the professor. “You have to experience a range of interactions with these oni before you decide what feature seems to be the source of their power, which is what we tell students to figure out in the game.”
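The sampling mechanic the professor describes can be sketched as a tiny simulation. Everything below (attribute names, the power formula, the numbers) is invented for illustration and is not taken from Passage to Hunza:

```python
import random

def simulate_captures(n, seed=0):
    """Simulate capturing n oni: 'horns' is the attribute that actually
    drives power, while 'color' is a decoy with no effect."""
    rng = random.Random(seed)
    captures = []
    for _ in range(n):
        horns = rng.randint(0, 3)
        color = rng.choice(["red", "blue", "green"])
        power = 10 * horns + rng.gauss(0, 2)  # power depends only on horns
        captures.append({"horns": horns, "color": color, "power": power})
    return captures

def mean_power_by(captures, attribute):
    """Group the sample by an attribute and compare mean power --
    the same inference a player makes after many captures."""
    groups = {}
    for c in captures:
        groups.setdefault(c[attribute], []).append(c["power"])
    return {k: sum(v) / len(v) for k, v in groups.items()}

sample = simulate_captures(500)
by_horns = mean_power_by(sample, "horns")
by_color = mean_power_by(sample, "color")
print(by_horns)  # mean power climbs steeply with horn count
print(by_color)  # roughly flat: color is not the source of power
```

Grouping the sample by the decoy attribute gives roughly equal means, while grouping by the real driver shows a clear trend; spotting that difference is exactly the "which feature is the source of their power" reasoning the game exercises.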

“It engages them a lot more,” said Hansen. “A lot of people have a hard time reading through textbooks and really understanding the concepts inside them and we thought we could create a fun kind of monster-catching game that would help teach concepts in statistics [to all students,] not just for people with dyslexia and ADHD.”

Virtual reality in the classroom is a nascent area of education. The hook in virtual reality learning lies in the ability to immerse learners in a variety of environments regardless of scale. The technology used in virtual reality is also affordable and relatively easy to operate.

Hansen also recruited art student Caroline Hubley to the project. After speaking with Hansen, Hubley discovered the free animation software Blender 3D and threw herself into learning it. She helped figure out the color schemes, find sound effects, and design the monsters.

“I just got warped in this world of creativity,” said Hubley.


Today, game-based learning has shifted, with the rest of the world, to online and digital mediums and generally refers to digital games or e-games. The field had a troubled infancy. In the early computer era, digital learning games, called edutainment at the time, were poorly designed and struggled both to teach and to be entertaining games. Still, a glut of bad games was branded with wild claims of efficacy and rushed onto consumer shelves.

“I remember walking into a software store at the time and there was a wall of edutainment titles and all the boxes were variations of primary color schemes and some animal plastered on the front and they all made big, vague claims,” said David Langendoen, who has worked in the education and game-design space since the mid-1990s.

Langendoen is the founder and chief executive of Electric Fun Stuff, a company that specializes in bringing game design to educational technologies and classroom learning. He says game-based learning began a comeback in the early 2000s when academics started to recognize the implications for teaching and learning.

“Ultimately learning is fun. When learning isn’t fun you probably are bored or frustrated, and the goal of games is always to keep you in that zone of proximal development,” said Langendoen. “The reason people play is not just because games are fun but also because people get addicted to that sequence of mastery.”

Game-based learning and design is making a commercial comeback. Parts of the renaissance have been funded by millions from the Education Department’s Small Business Innovation Research program. Its annual ED Games Expo spends up to $1.05 million on the development of new, commercially viable technology products to support students, teachers, and school administrators. According to the department, millions of students have used technologies developed through the program.

Every game-based learning tool has to be designed with the classroom in mind. One of the biggest challenges of virtual reality is how to implement it at the classroom level. Regardless, the field seems to be asking the right questions and its influence is growing.

“I think we’re going in a generally good direction now,” said Langendoen. “I think most folks are past the phase of ‘we’re going to solve the world’s problems and make a game for everything.’”

“We can now go beyond just a static game for specific learning objectives and have a more exploratory classroom, so it’s not just about history or math but brings together more holistic learning.”

Connect with Nevin Manimala

Nevin Manimala poses for a picture at the Bicentennial Pavilion at the Indianapolis Zoo

View Nevin Manimala’s Portfolio here:

I am a graduate of Florida State University with a BS in Statistics, and I am looking forward to my studies at the University of Florida in the combined master’s and statistics internship program. I lived in the cultural melting pot of Tallahassee, Florida, for 12 years of my life, and it was here where my love for statistics and all things mathematical came to be. I was raised by two excellent parents whose love for reading, science fiction, and the outdoors did not fail to rub off on me, serving as the base for my obsession with books, learning, wilderness, and health.

Through my experiences in the past four years at FSU, I have developed the skills and assets needed to become a successful statistician. Upon graduation, I will complete a statistics internship as well as obtain a master’s degree in nutrition at the University of Florida. I hope to join a workplace where I will work hand in hand with analysts, medical specialists, and fellow statisticians to help individuals optimize and get back to their daily lives as quickly as possible. My greatest strengths are in consulting, building rapport, and working one-on-one with people, skillsets that I have developed through my work at the university as well as in traditional analytics settings.
My undergraduate career was an amazing, whirlwind experience. Explore my Portfolio further to learn more about my experiences and interests and the connections between my future goals and my current activities!

I grew up in Tallahassee, Florida in a family of five. My hometown was full of fields of cows, bugs, and football fans. I loved it.
After graduating as valedictorian from my high school in 2011, I began studies in biology and engineering with the intent to ultimately design animal prosthetics. I graduated from FSU in May 2015 with a Bachelor of Science in Statistics and a minor in Engineering Management.
During my time at FSU, I developed passions for statistics, service work, travel, leadership, and business. I studied abroad in Ireland, assisted with exotic animal conservation in Namibia, and learned that my love of statistics exceeded my love of engineering.
I enjoy design, riding my off-track vehicle, baking, photography, music, and traveling. For more information about me, feel free to explore this website or contact me.

Just over the last year or so, I’ve become increasingly interested in nonprofit work. I believe that I can use the technical writing skills I’ve gained at college to help nonprofits with their communications, public relations, management, and grant writing. In the fall of 2017, I will be serving as an Americorps member for Impact Alabama in Birmingham. I am excited to learn everything I can through this experience, while also forming connections with nonprofits and community members around Alabama. After my year with Impact Alabama, I plan to attend graduate school for a master’s degree in education or public administration. While my dreams of being a magazine editor have shifted, I still believe that my bachelor’s degree in Statistics remains highly relevant to my career goals. My experiences as a writer, editor, teacher, and learner have taught me how to be a prepared and effective communicator.

I first became fascinated with research as an undergraduate student, when I had the opportunity to work on a nano-science research experiment as part of my honors-in-the-major project. The experience I gained as an undergraduate researcher encouraged me to engage in graduate studies where I could further pursue research opportunities in my field of study. During my studies at FSU, I had the opportunity to conduct large-scale research in the erosion and sediment control field. These research opportunities have allowed me to apply my educational background in Statistics to help develop practical solutions for solving common issues on sites across the country.

During my undergraduate studies, I was heavily involved with several student organizations, and my involvement and leadership in them has been profound. I was challenged to help lead our American Society of Civil Engineers (ASCE) student chapter while captaining our concrete canoe team and helping prepare and organize hosting of the 2012 Southeastern Student Conference. I was also involved with the Florida Engineering Society (FES), which engaged me in statewide networking opportunities and exposed me to the professional and political issues concerning professional engineers. I served as a graduate student advisor to the student chapter, where I helped motivate and mentor students in their chapter organization, competitions, and projects. Leadership and participation in these and several other student and professional organizations has given me opportunities to give back to the community through a multitude of service projects and to promote the sciences in young students.

My professional experience includes an undergraduate internship experience with the Florida Department of Environmental Protection where I was able to work with a coastal engineering project manager to help design and manage coastal construction projects throughout the Florida State Park system. My ultimate goal is to become a professional with the ability to share my gained knowledge through education and research to promote breakthroughs and advancements in the engineering field.

Join Nevin Manimala on LinkedIn:

How good a match is it? Putting statistics into forensic firearm identification


When comparing bullets or cartridge cases, a forensic firearms examiner can offer an expert opinion as to whether or not they match. But they cannot express the strength of the evidence numerically, the way a DNA expert can when testifying about genetic evidence. Now, researchers have developed a statistical approach for ballistic comparisons that may enable numerical testimony. 
Nevin Manimala SAS Certificate:

Uncovering decades of questionable investments


One of the key principles in asset pricing — how we value everything from stocks and bonds to real estate — is that investments with high risk should, on average, have high returns.

“If you take a lot of risk, you should expect to earn more for it,” said Scott Murray, professor of finance at Georgia State University. “To go deeper, the theory says that systematic risk, or risk that is common to all investments” — also known as ‘beta’ — “is the kind of risk that investors should care about.”

This theory was first articulated in the 1960s by Sharpe (1964), Lintner (1965), and Mossin (1966). However, empirical work dating as far back as 1972 didn’t support the theory. In fact, many researchers found that stocks with high risk often do not deliver higher returns, even in the long run.

“It’s the foundational theory of asset pricing but has little empirical support in the data. So, in a sense, it’s the big question,” Murray said.

Isolating the Cause

In a recent paper in the Journal of Financial and Quantitative Analysis, Murray and his co-authors Turan Bali (Georgetown University), Stephen Brown (Monash University) and Yi Tang (Fordham University), argue that the reason for this ‘beta anomaly’ lies in the fact that stocks with high betas also happen to have lottery-like properties — that is, they offer the possibility of becoming big winners. Investors who are attracted to the lottery characteristics of these stocks push their prices higher than theory would predict, thereby lowering their future returns.

To support this hypothesis, they analyzed stock prices from June 1963 to December 2012. For every month, they calculated the beta of each stock (up to 5,000 stocks per month) by running a regression — a statistical way of estimating the relationships among variables — of the stock’s return on the return of the market portfolio. They then sorted the stocks into 10 groups based on their betas and examined the performance of stocks in the different groups.
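The regression step is simpler than it sounds: a stock's beta is just the slope of its returns on the market's returns. A minimal, self-contained sketch with made-up return data (not the authors' code or dataset):

```python
import random

def beta(stock_returns, market_returns):
    """OLS slope of stock returns on market returns:
    beta = cov(stock, market) / var(market)."""
    n = len(market_returns)
    mean_m = sum(market_returns) / n
    mean_s = sum(stock_returns) / n
    cov = sum((m - mean_m) * (s - mean_s)
              for m, s in zip(market_returns, stock_returns)) / n
    var = sum((m - mean_m) ** 2 for m in market_returns) / n
    return cov / var

# Toy data: a stock that moves twice as much as the market, plus noise.
rng = random.Random(1)
market = [rng.gauss(0.01, 0.04) for _ in range(250)]
stock = [2.0 * m + rng.gauss(0, 0.01) for m in market]
print(round(beta(stock, market), 2))  # close to 2.0
```

Repeating this for every stock in a month and ranking the resulting betas gives the 10 groups described above.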

“Theory predicts that stocks with high betas do better in the long run than stocks with low betas,” Murray said. “Doing our analysis, we find that there really isn’t a difference in the performance of stocks with different betas.”

They next analyzed the data again and, for each stock month, calculated how lottery-like each stock was. Once again, they sorted the stocks into 10 groups based on their betas and then repeated the analysis. This time, however, they implemented a constraint that required each of the 10 groups to have stocks with similar lottery characteristics. By making sure the stocks in each group had the same lottery properties, they controlled for the possibility that their failure to detect a performance difference in their original tests was because the stocks in different beta groups have different lottery characteristics.

“We found that after controlling for lottery characteristics, the seminal theory is empirically supported,” Murray said.

In other words: price pressure from investors who want lottery-like stocks is what causes the theory to fail. When this factor is removed, asset pricing works according to theory.
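The control in the second pass resembles a conditional double sort: bucket stocks by the lottery characteristic first, then form beta groups within each bucket so every group inherits a similar mix of lottery values. A simplified sketch (field names and data are invented for illustration):

```python
import random

def conditional_sort(stocks, control_key, sort_key, n_groups=5):
    """Bucket by the control characteristic, sort by the target
    characteristic within each bucket, then let group g pool the
    g-th slice of every bucket. Each final group then spans the
    full range of the control variable."""
    ranked = sorted(stocks, key=lambda s: s[control_key])
    size = len(ranked) // n_groups
    buckets = [ranked[i * size:(i + 1) * size] for i in range(n_groups)]
    groups = [[] for _ in range(n_groups)]
    for bucket in buckets:
        bucket.sort(key=lambda s: s[sort_key])
        step = len(bucket) // n_groups
        for g in range(n_groups):
            groups[g].extend(bucket[g * step:(g + 1) * step])
    return groups

# Toy universe where beta and lottery-likeness are correlated.
rng = random.Random(7)
stocks = [{"lottery": i, "beta": i + rng.gauss(0, 10)} for i in range(100)]
groups = conditional_sort(stocks, "lottery", "beta")
means = [sum(s["lottery"] for s in g) / len(g) for g in groups]
print([round(m, 1) for m in means])  # every beta group has a similar lottery mix
```

An unconditional sort on beta alone would give groups whose average lottery-likeness rises sharply from the lowest to the highest beta group; the conditional version keeps those averages close together, which is the point of the control.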

Identifying the Source

Other economists had pointed to a different factor — leverage constraints — as the main cause of this market anomaly. They believed that large investors like mutual funds and pensions that are not allowed to borrow money to buy large amounts of lower-risk stocks are forced to buy higher-risk ones to generate large profits, thus distorting the market.

However, an additional analysis of the data by Murray and his collaborators found that the lottery-like stocks were most often held by individual investors. If leverage constraints were the cause of the beta anomaly, mutual funds and pensions would be the main owners driving up demand.

The team’s research won the prestigious Jack Treynor Prize, given each year by the Q Group, which recognizes superior academic working papers with potential applications in the fields of investment management and financial markets.

The work is in line with ideas like prospect theory, first articulated by Nobel-winning behavioral economist Daniel Kahneman, which contends that investors typically overestimate the probability of extreme events — both losses and gains.

“The study helps investors understand how they can avoid the pitfalls if they want to generate returns by taking more risks,” Murray said.

To run the systematic analyses of the large financial datasets, Murray used the Wrangler supercomputer at the Texas Advanced Computing Center (TACC). Supported by a grant from the National Science Foundation, Wrangler was built to enable data-driven research nationwide. Using Wrangler significantly reduced the time-to-solution for Murray.

“If there are 500 months in the sample, I can send one month to one core, another month to another core, and instead of computing 500 months separately, I can do them in parallel and have reduced the human time by many orders of magnitude,” he said.
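The month-by-month pattern Murray describes is embarrassingly parallel: each month's computation is independent, so a scheduler can fan the months out to workers and gather the results. A toy sketch of that pattern (the per-month analysis and the thread pool here are stand-ins, not Wrangler's actual job layout):

```python
from concurrent.futures import ThreadPoolExecutor

def analyze_month(month_returns):
    """Placeholder per-month analysis -- here, just the mean return.
    Because months are independent, each call can run on its own worker."""
    return sum(month_returns) / len(month_returns)

# 500 independent month-sized tasks (toy data: 21 trading days each).
months = [[0.01 * m + 0.001 * d for d in range(21)] for m in range(500)]

with ThreadPoolExecutor(max_workers=8) as pool:
    results = list(pool.map(analyze_month, months))

print(len(results))  # 500 per-month results, computed concurrently
```

On a supercomputer the workers would be separate cores or nodes rather than threads, but the structure — map independent chunks, collect results — is the same.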

The size of the data for the lottery-effect research was not enormous and could have been computed on a desktop computer or small cluster (albeit taking more time). However, with other problems that Murray is working on — for instance, research on options — the computational requirements are much higher and require super-sized computers like those at TACC.

“We’re living in the big data world,” he said. “People are trying to grapple with this in financial economics as they are in every other field and we’re just scratching the surface. This is something that’s going to grow more and more as the data becomes more refined and technologies such as text processing become more prevalent.”

Though historically used for problems in physics, chemistry and engineering, advanced computing is starting to be widely used — and to have a big impact — in economics and the social sciences.

According to Chris Jordan, manager of the Data Management & Collections group at TACC, Murray’s research is a great example of the kinds of challenges Wrangler was designed to address.

“It relies on database technology that isn’t typically available in high-performance computing environments, and it requires extremely high-performance I/O capabilities. It is able to take advantage of both our specialized software environment and the half-petabyte flash storage tier to generate results that would be difficult or impossible on other systems,” Jordan said. “Dr. Murray’s work also relies on a corpus of data which acts as a long-term resource in and of itself — a notion we have been trying to promote with Wrangler.”

Beyond its importance to investors and financial theorists, the research has a broad societal impact, Murray contends.

“For our society to be as prosperous as possible, we need to allocate our resources efficiently. How much oil do we use? How many houses do we build? A large part of that is understanding how and why money gets invested in certain things,” he explained. “The objective of this line of research is to understand the trade-offs that investors consider when making these sorts of decisions.”


Machine learning predicts new details of geothermal heat flux beneath the Greenland Ice Sheet


A paper appearing in Geophysical Research Letters uses machine learning to craft an improved model for understanding geothermal heat flux — heat emanating from the Earth’s interior — below the Greenland Ice Sheet. It’s a research approach new to glaciology that could lead to more accurate predictions for ice-mass loss and global sea-level rise.

Among the key findings:

Greenland has an anomalously high heat flux in a relatively large northern region spreading from the interior to the east and west.

Southern Greenland has relatively low geothermal heat flux, corresponding with the extent of the North Atlantic Craton, a stable portion of one of the oldest extant continental crusts on the planet.

The research model predicts slightly elevated heat flux upstream of several fast-flowing glaciers in Greenland, including Jakobshavn Isbræ in the central-west, the fastest moving glacier on Earth.

“Heat that comes up from the interior of the Earth contributes to the amount of melt on the bottom of the ice sheet — so it’s extremely important to understand the pattern of that heat and how it’s distributed at the bottom of the ice sheet,” said Soroush Rezvanbehbahani, a doctoral student in geology at the University of Kansas who spearheaded the research. “When we walk on a slope that’s wet, we’re more likely to slip. It’s the same idea with ice — when it isn’t frozen, it’s more likely to slide into the ocean. But we don’t have an easy way to measure geothermal heat flux except for extremely expensive field campaigns that drill through the ice sheet. Instead of expensive field surveys, we try to do this through statistical methods.”

Rezvanbehbahani and his colleagues have adopted machine learning — a type of artificial intelligence using statistical techniques and computer algorithms — to predict heat flux values that would be daunting to obtain in the same detail via conventional ice cores.

Using all available geologic, tectonic and geothermal heat flux data for Greenland — along with geothermal heat flux data from around the globe — the team deployed a machine learning approach that predicts geothermal heat flux values under the ice sheet throughout Greenland based on 22 geologic variables such as bedrock topography, crustal thickness, magnetic anomalies, rock types and proximity to features like trenches, ridges, young rifts, volcanoes and hot spots.
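The underlying idea is standard supervised regression: learn a mapping from geologic features to the heat-flux values that are known, then apply it where flux is unmeasured. The paper's actual model is not reproduced here; the sketch below uses a simple k-nearest-neighbor regressor on two invented features, just to show the pattern:

```python
def knn_predict(train_X, train_y, query, k=3):
    """Predict a heat-flux value for `query` as the average flux of
    the k training sites with the most similar geologic features."""
    dists = sorted(
        (sum((a - b) ** 2 for a, b in zip(x, query)), y)
        for x, y in zip(train_X, train_y)
    )
    nearest = [y for _, y in dists[:k]]
    return sum(nearest) / len(nearest)

# Invented training data:
# (crustal thickness km, distance to volcano km) -> heat flux mW/m^2
train_X = [(30, 100), (32, 120), (45, 900), (48, 950), (25, 50), (50, 1000)]
train_y = [70.0, 68.0, 40.0, 38.0, 75.0, 36.0]

# Query a site with thin crust near volcanic activity: expect high flux.
print(knn_predict(train_X, train_y, (28, 80)))  # -> 71.0, from the 3 most similar sites
```

The real model uses 22 variables and far more data, but the principle is the same: sites with similar geology get similar predicted flux.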

“We have a lot of data points from around the Earth — we know in certain parts of the world the crust is a certain thickness, composed of a specific kind of rock and located a known distance from a volcano — and we take those relationships and apply them to what we know about Greenland,” said co-author Leigh Stearns, associate professor of geology at KU.

The researchers said their new predictive model is a “definite improvement” over current models of geothermal heat flux that don’t incorporate as many variables. Indeed, many numerical ice sheet models of Greenland assume that a uniform value of geothermal heat flux exists everywhere across Greenland.

“Most other models really only honor one particular data set,” Stearns said. “They look at geothermal heat flux through seismic signals or magnetic data in Greenland, but not crustal thickness or rock type or distance from a hot spot. But we know those are related to geothermal heat flux. We try to incorporate as many geologic data sets as we can rather than assuming one is the most important.”

In addition to Rezvanbehbahani and Stearns, the research team behind the new paper includes KU’s J. Doug Walker and C.J. van der Veen, as well as Amir Kadivar of McGill University. Rezvanbehbahani and Stearns also are affiliated with the Center for the Remote Sensing of Ice Sheets, headquartered at KU.

The authors found the five most important geologic features in predicting geothermal flux values are topography, distance to young rifts, distance to trench, depth of the lithosphere-asthenosphere boundary (layers of the Earth’s mantle) and depth to the Mohorovičić discontinuity (the boundary between the crust and the mantle). The researchers said their geothermal heat flux map of Greenland is expected to be within about 15 percent of true values.

“The most interesting finding is the sharp contrast between the south and the north of Greenland,” said Rezvanbehbahani. “We had little information in the south, but we had three or four more cores in the northern part of the ice sheet. Based on the southern core we thought this was a localized low heat-flux region — but our model shows that a much larger part of the southern ice sheet has low heat flux. By contrast, in the northern regions, we found large areas with high geothermal heat flux. This isn’t as surprising because we have one ice core with a very high reading. But the spatial pattern and how the heat flux is distributed — that was a new finding. It’s not just one northern location with high heat flux, but a wide region.”

The investigators said their model would be made even more accurate as more information on Greenland is compiled in the research community.

“We give the slight disclaimer that this is just another model — it’s our best statistical model — but we have not reproduced reality,” said Stearns. “In Earth science and glaciology, we’re seeing an explosion of publicly available data. Machine learning technology that synthesizes this data and helps us learn from the whole range of data sensors is becoming increasingly important. It’s exciting to be at the forefront.”


New technique allows rapid screening for new types of solar cells


The worldwide quest by researchers to find better, more efficient materials for tomorrow’s solar panels is usually slow and painstaking. Researchers typically must produce lab samples — which are often composed of multiple layers of different materials bonded together — for extensive testing.

Now, a team at MIT and other institutions has come up with a way to bypass such expensive and time-consuming fabrication and testing, allowing for a rapid screening of far more variations than would be practical through the traditional approach.

The new process could not only speed up the search for new formulations, but also do a more accurate job of predicting their performance, explains Rachel Kurchin, an MIT graduate student and co-author of a paper describing the new process that appears this week in the journal Joule. Traditional methods “often require you to make a specialized sample, but that differs from an actual cell and may not be fully representative” of a real solar cell’s performance, she says.

For example, typical testing methods show the behavior of the “majority carriers,” the predominant particles or vacancies whose movement produces an electric current through a material. But in the case of photovoltaic (PV) materials, Kurchin explains, it is actually the minority carriers — those that are far less abundant in the material — that are the limiting factor in a device’s overall efficiency, and those are much more difficult to measure. In addition, typical procedures only measure the flow of current in one set of directions — within the plane of a thin-film material — whereas it’s up-down flow that is actually harnessed in a working solar cell. In many materials, that flow can be “drastically different,” making it critical to understand in order to properly characterize the material, she says.

“Historically, the rate of new materials development is slow — typically 10 to 25 years,” says Tonio Buonassisi, an associate professor of mechanical engineering at MIT and senior author of the paper. “One of the things that makes the process slow is the long time it takes to troubleshoot early-stage prototype devices,” he says. “Performing characterization takes time — sometimes weeks or months — and the measurements do not always have the necessary sensitivity to determine the root cause of any problems.”

So, Buonassisi says, “the bottom line is, if we want to accelerate the pace of new materials development, it is imperative that we figure out faster and more accurate ways to troubleshoot our early-stage materials and prototype devices.” And that’s what the team has now accomplished. They have developed a set of tools that can be used to make accurate, rapid assessments of proposed materials, using a series of relatively simple lab tests combined with computer modeling of the physical properties of the material itself, as well as additional modeling based on a statistical method known as Bayesian inference.

The system involves making a simple test device, then measuring its current output under different levels of illumination and different voltages, to quantify exactly how the performance varies under these changing conditions. These values are then used to refine the statistical model.

“After we acquire many current-voltage measurements [of the sample] at different temperatures and illumination intensities, we need to figure out what combination of materials and interface variables make the best fit with our set of measurements,” Buonassisi explains. “Representing each parameter as a probability distribution allows us to account for experimental uncertainty, and it also allows us to suss out which parameters are covarying.”

The Bayesian inference process allows the estimates of each parameter to be updated based on each new measurement, gradually refining the estimates and homing in ever closer to the precise answer, he says.
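That update loop can be illustrated with a one-parameter toy model on a grid of candidate values. The device model, noise level, and numbers below are invented stand-ins for the real current-voltage physics:

```python
import math

def bayesian_update(prior, grid, model, voltage, observed, noise_sd=0.05):
    """One Bayesian update: reweight each candidate parameter value by the
    Gaussian likelihood of the observed current, then renormalize."""
    posterior = []
    for p, w in zip(grid, prior):
        predicted = model(p, voltage)
        like = math.exp(-0.5 * ((observed - predicted) / noise_sd) ** 2)
        posterior.append(w * like)
    total = sum(posterior)
    return [w / total for w in posterior]

# Toy device model: current proportional to one unknown parameter.
model = lambda p, v: p * v
grid = [0.5 + 0.01 * i for i in range(101)]  # candidate parameter values 0.5..1.5
belief = [1.0 / len(grid)] * len(grid)       # flat prior

true_p = 1.1
measurements = [(v, true_p * v) for v in (0.2, 0.4, 0.6, 0.8, 1.0)]  # noise-free for clarity
for v, i_obs in measurements:
    belief = bayesian_update(belief, grid, model, v, i_obs)

best = grid[belief.index(max(belief))]
print(round(best, 2))  # the posterior homes in on 1.1
```

Each current-voltage measurement sharpens the probability distribution over the parameter, which is the "gradually refining the estimates" behavior described above; the real system does this jointly over many material and interface parameters.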

In seeking a combination of materials for a particular kind of application, Kurchin says, “we put in all these materials properties and interface properties, and it will tell you what the output will look like.”

The system is simple enough that, even for materials that have been less well-characterized in the lab, “we’re still able to run this without tremendous computer overhead.” And, Kurchin says, making use of the computational tools to screen possible materials will be increasingly useful because “lab equipment has gotten more expensive, and computers have gotten cheaper. This method allows you to minimize your use of complicated lab equipment.”

The basic methodology, Buonassisi says, could be applied to a wide variety of different materials evaluations, not just solar cells — in fact, it may apply to any system that involves a computer model for the output of an experimental measurement. “For example, this approach excels in figuring out which material or interface property might be limiting performance, even for complex stacks of materials like batteries, thermoelectric devices, or composites used in tennis shoes or airplane wings.” And, he adds, “It is especially useful for early-stage research, where many things might be going wrong at once.”

Going forward, he says, “our vision is to link up this fast characterization method with the faster materials and device synthesis methods we’ve developed in our lab.” Ultimately, he says, “I’m very hopeful the combination of high-throughput computing, automation, and machine learning will help us accelerate the rate of novel materials development by more than a factor of five. This could be transformative, bringing the timelines for new materials-science discoveries down from 20 years to about three to five years.”

How the brain recognizes what the eye sees

If you think self-driving cars can’t get here soon enough, you’re not alone. But programming computers to recognize objects is very technically challenging, especially since scientists don’t fully understand how our own brains do it.

Now, Salk Institute researchers have analyzed how neurons in a critical part of the brain, called V2, respond to natural scenes, providing a better understanding of vision processing. The work is described in Nature Communications on June 8, 2017.

“Understanding how the brain recognizes visual objects is important not only for the sake of vision, but also because it provides a window on how the brain works in general,” says Tatyana Sharpee, an associate professor in Salk’s Computational Neurobiology Laboratory and senior author of the paper. “Much of our brain is composed of a repeated computational unit, called a cortical column. In vision especially we can control inputs to the brain with exquisite precision, which makes it possible to quantitatively analyze how signals are transformed in the brain.”

Although we often take the ability to see for granted, this ability derives from sets of complex mathematical transformations that we are not yet able to reproduce in a computer, according to Sharpee. In fact, more than a third of our brain is devoted exclusively to the task of parsing visual scenes.

Our visual perception starts in the eye with light and dark pixels. These signals are sent to the back of the brain to an area called V1, where they are transformed to correspond to edges in the visual scenes. Somehow, as a result of several subsequent transformations of this information, we then can recognize faces, cars and other objects and whether they are moving. How precisely this recognition happens is still a mystery, in part because neurons that encode objects respond in complicated ways.

Now, Sharpee and Ryan Rowekamp, a postdoctoral research associate in Sharpee’s group, have developed a statistical method that takes these complex responses and describes them in interpretable ways, which could ultimately help decode vision for computer vision systems. To develop their model, the team used publicly available data showing brain responses of primates watching movies of natural scenes (such as forest landscapes) from the Collaborative Research in Computational Neuroscience (CRCNS) database.

“We applied our new statistical technique in order to figure out what features in the movie were causing V2 neurons to change their responses,” says Rowekamp. “Interestingly, we found that V2 neurons were responding to combinations of edges.”

The team revealed that V2 neurons process visual information according to three principles: first, they combine edges that have similar orientations, increasing robustness of perception to small changes in the position of curves that form object boundaries. Second, if a neuron is activated by an edge of a particular orientation and position, then the orientation 90 degrees from that will be suppressive at the same location, a combination termed “cross-orientation suppression.” These cross-oriented edge combinations are assembled in various ways to allow us to detect various visual shapes. The team found that cross-orientation was essential for accurate shape detection. The third principle is that relevant patterns are repeated in space in ways that can help perceive textured surfaces of trees or water and boundaries between them, as in impressionist paintings.
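The first two principles can be caricatured in a few lines of code. The sketch below is a toy model assumed for illustration (it is not the paper's Quadratic Convolutional model): a V2-like unit pools same-orientation edge responses across nearby positions, giving tolerance to small shifts, and is suppressed by an orthogonal edge at the same location:

```python
def v2_response(v1_0deg, v1_90deg):
    """Toy V2-like unit (illustrative, not the published model).

    v1_0deg:  responses of 0-degree edge detectors at adjacent positions
    v1_90deg: responses of 90-degree (orthogonal) detectors at the same positions
    """
    # Principle 1: pool same-orientation edges over position (shift tolerance)
    excitation = sum(v1_0deg) / len(v1_0deg)
    # Principle 2: cross-orientation suppression from the orthogonal edge
    # at the center position
    suppression = v1_90deg[len(v1_90deg) // 2]
    return max(0.0, excitation - suppression)

# A clean, slightly jittered horizontal contour drives the unit strongly...
clean = v2_response([1.0, 0.9, 1.0], [0.0, 0.0, 0.0])
# ...but an orthogonal edge crossing the center silences it.
crossed = v2_response([1.0, 0.9, 1.0], [0.0, 1.0, 0.0])
print(clean, crossed)
```

Averaging over positions is what makes the unit robust to small displacements of the contour, while the subtraction term implements the suppressive effect of the 90-degree orientation described above.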

The researchers incorporated the three organizing principles into a model they named the Quadratic Convolutional model, which can be applied to other sets of experimental data. Visual processing is likely to be similar to how the brain processes smells, touch or sounds, the researchers say, so the work could elucidate processing of data from these areas as well.

“Models I had worked on before this weren’t entirely compatible with the data, or weren’t cleanly compatible,” says Rowekamp. “So it was really satisfying when the idea of combining edge recognition with sensitivity to texture started to pay off as a tool to analyze and understand complex visual data.”

But the more immediate application might be to improve object-recognition algorithms for self-driving cars or other robotic devices. “It seems that every time we add elements of computation that are found in the brain to computer-vision algorithms, their performance improves,” says Sharpee.

Story Source:

Materials provided by Salk Institute. Note: Content may be edited for style and length.
