

Introduction

In today's data-driven world, the power of data science is undeniable. From uncovering hidden patterns to predicting future trends, data science is revolutionizing industries. In this article, we will explore its fascinating and diverse uses, and its potential to transform business, healthcare, finance, and more. Get ready to discover the possibilities of harnessing data science for innovation and success.

Learning Objectives

Common data science uses

Advanced applications

Data science tools and techniques

Future impact of data science

Top 11 Uses of Data Science

Using data science, companies have become intelligent enough to push and sell products tailored to each customer's purchasing power and interests. Here is how they are ruling our hearts and minds:

1. Internet Search

When we speak of search, we think 'Google'. Right? But there are many other search engines, such as Yahoo, Bing, Ask, AOL, and DuckDuckGo. All of them (including Google) use data science algorithms to deliver the best results for a query in a fraction of a second. Consider that Google alone processes more than 20 petabytes of data every day. Without data science, Google wouldn't be the 'Google' we know today.
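The core idea behind ranking pages by their link structure can be sketched in a few lines. Below is a toy version of PageRank computed by power iteration; it only illustrates the principle, not Google's production ranking, which blends hundreds of signals. The miniature four-page "web" is invented for the example.

```python
# Toy PageRank by power iteration -- an illustrative sketch, not
# Google's production ranking.
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping page -> list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:  # dangling page: spread its rank evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# A tiny made-up link graph: page "c" is linked to by everyone else.
web = {"a": ["b", "c"], "b": ["c"], "c": ["a"], "d": ["c"]}
ranks = pagerank(web)
print(max(ranks, key=ranks.get))  # "c" attracts the most rank here
```

The intuition matches the search story above: a page is important if important pages link to it, and the iteration simply lets that definition settle to a fixed point.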

2. Digital Advertisements (Targeted Advertising and re-targeting)

If you thought search was the biggest application of data science and machine learning, here is a challenger: the entire digital marketing spectrum. From the display banners on various websites to the digital billboards at airports, almost all of them are placed using data science algorithms.

3. Recommender Systems

Who can forget the suggestions for similar products on Amazon? They not only help you find relevant products among the billions available, but also add a lot to the user experience.

Many companies use recommendation engines to promote their products and suggestions in accordance with a user's interests and the relevance of the information. Internet giants like Amazon, Twitter, Google Play, Netflix, LinkedIn, IMDb, and many more use such systems to improve the user experience. The recommendations are based on a user's previous searches and behavior.
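To make the idea concrete, here is a minimal user-based collaborative filter in plain Python. The ratings data is invented, and the real systems at companies like Amazon or Netflix are far more sophisticated; this only sketches the "users similar to you liked X" principle.

```python
import math

# user -> {item: rating}; purely hypothetical data for illustration
ratings = {
    "ann": {"laptop": 5, "mouse": 4, "desk": 1},
    "bob": {"laptop": 4, "mouse": 5},
    "cat": {"desk": 5, "lamp": 4},
}

def cosine(u, v):
    """Cosine similarity between two {item: rating} vectors."""
    common = set(u) & set(v)
    if not common:
        return 0.0
    dot = sum(u[i] * v[i] for i in common)
    norm = (math.sqrt(sum(x * x for x in u.values()))
            * math.sqrt(sum(x * x for x in v.values())))
    return dot / norm

def recommend(user):
    """Rank unseen items by ratings from similar users."""
    scores = {}
    for other, theirs in ratings.items():
        if other == user:
            continue
        sim = cosine(ratings[user], theirs)
        for item, r in theirs.items():
            if item not in ratings[user]:
                scores[item] = scores.get(item, 0.0) + sim * r
    return sorted(scores, key=scores.get, reverse=True)

print(recommend("bob"))  # items bob hasn't rated, weighted by similar users
```

Because "ann" rates the same gadgets as "bob", her opinions carry the most weight when suggesting something new to him.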

4. Image Recognition

You upload a photo with friends on Facebook and start getting suggestions to tag them. This automatic tag-suggestion feature uses a face recognition algorithm. Similarly, to use WhatsApp Web, you scan a QR code in your web browser with your mobile phone. In addition, Google lets you search by uploading an image; it uses image recognition to provide related results.
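At its simplest, recognition boils down to comparing a new image against known examples. The sketch below stands in for that idea with tiny made-up pixel grids and a nearest-neighbor match; production systems use deep neural networks rather than raw pixel distances, so treat this strictly as an illustration of the matching step.

```python
# Each "image" is a tiny grayscale grid flattened to a list of pixel
# values (0-255). The faces below are invented for the example.
known_faces = {
    "alice": [0, 0, 255, 255, 0, 0, 128, 128, 0],
    "bob":   [255, 255, 0, 0, 128, 128, 0, 0, 255],
}

def closest_face(pixels):
    """Return the known face with the smallest squared pixel distance."""
    def distance(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(known_faces, key=lambda name: distance(known_faces[name], pixels))

# A noisy upload that is almost, but not exactly, "alice"
upload = [0, 10, 250, 250, 5, 0, 120, 130, 0]
print(closest_face(upload))  # alice
```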

5. Speech Recognition

Some of the best examples of speech recognition products are Google Voice, Siri, and Cortana. With speech recognition, even if you aren't in a position to type a message, your life wouldn't stop: simply speak the message and it will be converted to text. At times, though, you will notice that speech recognition doesn't perform accurately.

6. Gaming

EA Sports, Zynga, Sony, Nintendo, and Activision Blizzard have taken the gaming experience to the next level using data science. Games are now designed with machine learning algorithms that improve and upgrade themselves as the player moves up to higher levels. In motion gaming, too, your opponent (the computer) analyzes your previous moves and shapes its game accordingly.

7. Price Comparison Websites

At a basic level, these websites are driven by lots and lots of data, fetched using APIs and RSS feeds. If you have ever used one, you know the convenience of comparing the price of a product from multiple vendors in one place. PriceGrabber, PriceRunner, Junglee, Shopzilla, and DealTime are some examples of price comparison websites. Nowadays, they can be found in almost every domain, such as technology, hospitality, automobiles, durables, and apparel.
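The heart of such a site can be sketched as a merge over vendor feeds followed by picking the lowest price per product. The vendor names and prices below are made up; as noted above, real sites fetch this data live via APIs and RSS feeds.

```python
# Hypothetical per-vendor catalogues, standing in for fetched feeds
feeds = {
    "VendorA": {"headphones": 59.99, "keyboard": 34.50},
    "VendorB": {"headphones": 54.00, "monitor": 189.00},
    "VendorC": {"keyboard": 29.99, "monitor": 199.00},
}

def best_offers(feeds):
    """Return {product: (vendor, lowest_price)} across all feeds."""
    offers = {}
    for vendor, catalogue in feeds.items():
        for product, price in catalogue.items():
            if product not in offers or price < offers[product][1]:
                offers[product] = (vendor, price)
    return offers

for product, (vendor, price) in sorted(best_offers(feeds).items()):
    print(f"{product}: {price:.2f} from {vendor}")
```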

8. Airline Route Planning

The airline industry across the world is known to bear heavy losses. Except for a few service providers, companies are struggling to maintain occupancy ratios and operating profits. Sharp rises in air fuel prices and the need to offer heavy discounts to customers have made the situation worse. It wasn't long before airline companies started using data science to identify strategic areas of improvement. Using data science, airlines can now:

Predict flight delay

Decide which class of airplanes to buy

Decide whether to land directly at the destination or take a halt in between (for example, a flight from New Delhi to New York can fly direct, or it can choose to stop over along the way)

Effectively drive customer loyalty programs
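The first item above, flight delay prediction, can be illustrated with a toy logistic model. The features and weights here are entirely hypothetical; a real airline would learn them from years of operational data.

```python
import math

# Hand-set coefficients for a made-up logistic regression.
# In practice these would be fitted, not invented.
WEIGHTS = {"bias": -2.0, "rain": 1.5, "peak_hour": 0.8, "winter": 0.6}

def delay_probability(rain, peak_hour, winter):
    """Binary features in, probability of delay out (sigmoid)."""
    z = (WEIGHTS["bias"]
         + WEIGHTS["rain"] * rain
         + WEIGHTS["peak_hour"] * peak_hour
         + WEIGHTS["winter"] * winter)
    return 1.0 / (1.0 + math.exp(-z))

clear_offpeak = delay_probability(rain=0, peak_hour=0, winter=0)
stormy_peak = delay_probability(rain=1, peak_hour=1, winter=1)
print(f"{clear_offpeak:.2f} vs {stormy_peak:.2f}")
```

Even this toy shows the shape of the business value: the model turns observable conditions into a probability an airline can act on, say by rebooking connections in advance.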

Southwest Airlines and Alaska Airlines are among the top companies that have embraced data science to change the way they work.

9. Fraud and Risk Detection

One of the first applications of data science originated in finance. Companies were fed up with bad debts and losses every year, but they had a lot of data that was collected during the initial paperwork for sanctioning loans. They decided to bring in data science practices to rescue themselves from losses. Over the years, banks learned to divide and conquer this data via customer profiling, past expenditures, and other essential variables to analyze the probability of risk and default. It also helped them push their banking products based on a customer's purchasing power.
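The customer profiling described above can be pictured as a scorecard that adds penalty points for risky traits. The thresholds, point values, and cutoff below are invented for illustration and do not come from any real lender.

```python
# A minimal credit-risk scorecard -- a sketch of the idea only.
def risk_score(applicant):
    """Add penalty points for each hypothetical risk factor present."""
    score = 0
    if applicant["missed_payments"] > 2:
        score += 40
    if applicant["debt_to_income"] > 0.5:
        score += 30
    if applicant["years_of_history"] < 2:
        score += 20
    return score

def decision(applicant, cutoff=50):
    """Route high-scoring (riskier) applicants to manual review."""
    return "review" if risk_score(applicant) >= cutoff else "approve"

safe = {"missed_payments": 0, "debt_to_income": 0.2, "years_of_history": 8}
risky = {"missed_payments": 4, "debt_to_income": 0.7, "years_of_history": 1}
print(decision(safe), decision(risky))  # approve review
```

Modern systems replace the hand-set points with fitted models, but the workflow is the same: profile the customer, score the risk, act on the score.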

10. Delivery Logistics

Who says data science has limited applications? Logistics companies like DHL, FedEx, UPS, and Kuehne+Nagel have used data science to improve their operational efficiency. With it, these companies have discovered the best routes to ship, the best times to deliver, and the best modes of transport to choose, leading to cost efficiency and much more. Furthermore, the data these companies generate from their installed GPS devices opens up even more possibilities to explore with data science.
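Route optimization of the kind described above often starts with a shortest-path computation. Here is a sketch using Dijkstra's algorithm over an invented depot graph, with edge weights standing in for travel times; real routing engines layer traffic, time windows, and vehicle constraints on top of this.

```python
import heapq

def shortest_route(graph, start, goal):
    """graph: {node: {neighbor: travel_time}}. Returns (time, path)."""
    queue = [(0, start, [start])]
    seen = set()
    while queue:
        time, node, path = heapq.heappop(queue)
        if node == goal:
            return time, path
        if node in seen:
            continue
        seen.add(node)
        for neighbor, cost in graph[node].items():
            if neighbor not in seen:
                heapq.heappush(queue, (time + cost, neighbor, path + [neighbor]))
    return float("inf"), []

# A made-up network of depots with travel times in hours
depots = {
    "hub": {"north": 4, "east": 2},
    "north": {"hub": 4, "customer": 5},
    "east": {"hub": 2, "north": 1, "customer": 7},
    "customer": {},
}
time, path = shortest_route(depots, "hub", "customer")
print(time, path)  # 8 ['hub', 'east', 'north', 'customer']
```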

11. Miscellaneous

Apart from the applications mentioned above, data science is also used in marketing, finance, human resources, healthcare, government policy, and every industry where data is generated. Using data science, marketing departments decide which products are best suited for up-selling and cross-selling based on customers' behavioral data. Predicting a customer's wallet share, which customers are likely to churn, and which customers should be pitched a high-value product are all questions data science can readily answer. Finance (credit risk, fraud) and human resources (which employees are most likely to leave, employee performance, deciding bonuses) rely on it in the same way.

12. Self-Driving Cars

Self-driving cars are data science on wheels: streams of sensor data from cameras, radar, and other instruments feed machine learning models that detect objects, predict the behavior of other road users, and plan the vehicle's path.

13. Robots

Humanoid robots bring many of the applications above together, combining computer vision, speech recognition, and machine learning to perceive and act in the physical world.

Imagine a world where we are surrounded by robots like these. Will they do us any good, or will they lead to repercussions that mankind has to endure? Let's have a discussion on this!

End Notes

By now, you will have developed an understanding of the boundless potential data science has in this world. Almost everything on this planet that generates data falls under the radar of data science, which can improve and optimize its existing processes.





Ten Young Geniuses Shaking Up Science Today

Three of the Brilliant Ten

We have a credo around here: The future will be better. It may sound optimistic in light of our wheezing environment and limping economy, but then you haven’t met the Brilliant 10, PopSci’s annual selection of the nation’s most promising young researchers.

They’re 10 powerful reasons to look on the bright side. Take materials scientist Ting Xu. She’s using nanotechnology to craft solar cells that are more energy-efficient and eco-friendly than oil or coal. John Rinn is unlocking the secrets of RNA to keep us healthier, a vital step toward solving our health-care woes. Jerome Lynch is making smart sensors for bridges that spot structural flaws before disaster strikes. And not one of these geniuses is over 40. The world is facing some pretty big problems, we admit, but with these talented minds tackling them, can you blame us for feeling hopeful?

Helping Hands

Dennis Hong created the humanoid CHARLI to better study our own biomechanics.

The Robot Maker Brilliant because: He builds sophisticated robots that don’t just copy biology—they improve on its most elegant and efficient principles

Affiliation: Virginia Tech

In 1977, a six-year-old boy visiting Los Angeles from South Korea saw Star Wars for the first time. He gaped at the curious locomotion of R2-D2 and the human-robot interactions of C-3PO, and as he flew back home, Dennis Hong remembers, “I knew I was going to build robots for the rest of my life.”

Hong was born in California, but when he was three, his father, an aerospace engineer, moved the family to Seoul for a job. Hong lived there until his sophomore year of college, when he transferred to the University of Wisconsin, and went on to grad school at Purdue University. “All of it was mechanical engineering, focused on robotics,” he says.

Today, Hong runs Virginia Tech’s Robotics and Mechanisms Laboratory, which has produced a robotic hand that’s dexterous enough to handle an egg, a pole-climbing snake ‘bot for construction inspections, and a momentum-propelled, three-legged robot, among other projects.

“When I joined VT, people thought robotics should be all about intelligence,” Hong says. Instead, he chose to focus on mechanical systems found in nature. “We’re not copying nature; we’re using its principles,” he explains. The design of the three-legged robot, for instance, looks unnatural, yet it mimics the momentum of the human gait. To move forward, its hub flips over, causing one leg to swing between the other two. The robotic hand is controlled by compressed air, varying the strength of its grip without the use of other motors, in the same way human grip relies on elastic ligaments to help the fingers curl.

His lab’s latest effort is a humanoid called CHARLI, for Cognitive Humanoid Autonomous Robot with Learning Intelligence. It serves as a research platform for the study of human locomotion and a contender in Robocup 2010, a tournament in which robots compete in soccer matches.

Ultimately, Hong hopes to engineer robots that move with the grace and adaptability of humans. The key, he believes, is uninhibited research. In Korea, Hong recalls, “I grew up in an environment of people being afraid or ashamed to speak up. In my lab there’s no criticism, only refinement. You want to put a nuclear reactor in your robot? Fine, let’s pursue that.”

Leading by example, Hong has an organized way of putting his own least-inhibited ideas to use. “Next to my bed, I have a notebook and a pen,” he says. “Every night, I see lines, colorful things in my head. I wake up at 4 a.m., jot down everything. In the morning, I type it into my database of ideas. When funders want this or that, I look for a match.” —Jacob Ward

Rising Star

A few of Marla Geha’s galaxy-hunting tricks: velocity calculations, 3 a.m. e-mails, and superstitious routines to ensure clear skies.

The Star Chaser Brilliant because: She’s discovering nearly invisible galaxies circling our own, and the mysterious dark matter that dominates them

Affiliation: Yale University

Marla Geha has different job titles depending on who’s asking. “If I’m on a plane, I tend to be a physicist,” she says. “Then nobody wants to talk to me.” When she feels the need to impress someone, she’s an astrophysicist. And when she doesn’t mind a two-hour conversation, she tells them she’s an astronomer.

Geha is, in fact, all three. Now a professor at Yale, Geha spends her days (and, of course, nights) trying to identify faint galaxies that probably formed earlier than the Milky Way. Simulations of the Milky Way's evolution predict that there are about 1,000 such formations. When Geha came on the scene five years ago, astronomers had found just 11 of them. She and others believed that more existed, hidden from view because the galaxies were made mostly of dark matter, the term for whatever it is out there that emits no light yet accounts for most of the mass in the universe.

In the quest to solve the so-called missing-satellite problem, Geha pored over digital maps of the sky, looking for areas with unexpected concentrations of stars. Then she painstakingly measured the velocity of each star. To her amazement, she found that the stars were moving too quickly for their size—tantalizing evidence that dark matter might be tugging on them.

So far, Geha and her team have discovered 14 galaxies. She hopes to find enough to verify the reigning theory of how the universe formed, and perhaps along the way help other fields fully define dark matter. “Astronomers and particle physicists don’t talk to each other much,” she says. In the future, she’ll be the one starting the conversation. —Doug Cantor

Ting Xu

The Energizer Brilliant because: She transforms molecules into mini hard drives with massive storage capacity

Affiliation: University of California, Berkeley

Last fall, Ting Xu, a professor of materials science at the University of California at Berkeley, was suffering from headaches so severe that doctors worried she might have a brain tumor. But one neurologist suggested a simpler cause. How about cutting back on the 16-hour days in the lab, sleeping, and maybe even eating at normal times?

Xu has since eased her work schedule, but she's no less productive. Earlier this year she co-authored a paper describing a new technique for coaxing tiny polymer strands to self-assemble into 10 trillion cylinders with precise patterns. The method could lead to discs the size of a quarter that store 175 DVDs' worth of data, or 7 terabits. Then she tweaked the technique so it could be used to build a range of nanoparticle-based devices: super-efficient photovoltaic cells and energy-storage systems, and higher-resolution flexible displays. Xu is smart, diligent and knowledgeable, says polymer physicist Thomas Russell of the University of Massachusetts, but, more important, "she has imagination."

And a youthful one at that. She loves the Transformers. She’s a devotee of Tom and Jerry—watching the warring cat-and-mouse duo helps her think. Like her cartoon heroes, Xu, a native of China, has always been restless. She played volleyball and ran track growing up, but neither wore her out. Her father would offer to boost her allowance if she could sit for more than 15 minutes at a time. He never had to pay, and that energy continues to drive her today.

After reporting on the self-assembly method, which she created with Russell, Xu immediately saw greater potential. The strands, she realized, could serve as minuscule cranes to arrange even smaller building materials and manufacture things like ultrasmall electronic devices and paper-thin, printable solar cells. In her most recent work, Xu combined the self-assembling polymers with nanoscopic particles. By forcing these particles to assume the underlying order of the polymers, she managed to get trillions of them to line up exactly as she wanted.

Adam Wilson

The Mental Messenger Brilliant because: His engineering achievements will let people with disabilities control machines

Last April, Adam Wilson became the first person to send a telepathic message—on the social-networking site Twitter. “using eeg to send tweet,” he wrote, referring to the electroencephalograph he used to record electrical signals in his brain. Wearing a red skullcap embedded with electrodes wired to a computer, he spelled out his missive by focusing on letters flashing before him on a screen.

Beyond extrasensory tweets, Wilson’s deeper ambition for the technology is to help people who have lost the ability to communicate, whether from a stroke or a spinal-cord injury. He’s now developing powerful brain-machine interfaces that attach electrodes to the cerebral cortex, the wrinkled tissue just beneath the skull, where they pick up stronger brain signals than the EEG technique he used in the Twitter experiment. Partly inspired by his fascination with music—Wilson has played the guitar since the seventh grade—his new system taps a brain region that controls response to auditory stimuli, allowing people with neurological disorders to control a computer cursor simply by thinking about the sound of a cellphone ringing.

His next challenge is to engineer seamless wireless systems that could one day decipher complex thoughts—perhaps well enough to help his idol, physicist Stephen Hawking, whose struggle with ALS has left him almost fully paralyzed, open doors or steer his wheelchair with thoughts alone. Says Wilson, "I would love to work with him." —Melinda Wenner

Coloring Outside the Lines

John Rinn at his laboratory in Boston, sketching the mysterious workings of RNA

The Rule Shredder Brilliant because: A dropout skate rat turned ace biologist, he’s proving that “junk” RNA is a potential linchpin of human health

Affiliation: Harvard University/Beth Israel Deaconess Medical Center

John Rinn has a long history of bucking convention. Growing up, skateboarding and snowboarding took precedence over school—he attended four high schools in four years, only graduating because his mother promised him a car. He went to college at the University of Minnesota because it seemed like an excuse to party and hit the slopes. But bedridden with a snowboarding injury after his sophomore year, Rinn had a revelation, inspired by the uncompromising architect Howard Roark in Ayn Rand’s The Fountainhead. “What could I do that I cared that much about?” he asked himself. He began immersing himself in biology classes and realized that he not only had an aptitude for science, but he actually enjoyed it. He pulled mostly A’s and soon discovered the thing that would inspire his future career: RNA.

Science hasn’t dimmed Rinn’s rebellious side. He’s already upending the way biologists think about the human genome. Though similar to DNA, RNA has always been considered DNA’s helper; its best-known job is turning genes into proteins. Some of it was even thought to have no function at all, the equivalent of cellular junk. But in 2003, as a graduate student at Yale, Rinn discovered thousands of new types of RNA, called large intervening non-coding RNAs, or LINCs, and later proved that they play more than just a supporting role in regulating genes—they appear to direct the entire show. At the time, the notion was considered contentious, even ridiculous. “It was the same thing again—’what you’re passionate about is stupid,’ ” Rinn says. “Classic science was not ready for this. Almost nobody was ready for this.”

He silenced his critics in 2007 when he showed that one of the LINCs serves a vital function in human cells. He dubbed it HOTAIR, a wry nod to the fact that so many scientists thought his field of research was full of it. The molecule delivers proteins to a crucial cluster of genes and helps regulate immune response, cancer growth, and fat and stem-cell production, among other things. “If we can unravel their code, we can engineer these molecules to bend the genome to our will,” Rinn says. “That would be a totally new facet for therapeutics and human health.”

High-functioning RNA isn’t his only discovery. In 2006, he answered a long-standing biological question: How do cells know where to go and how to behave? By comparing the genes expressed in cells around the body, he uncovered a kind of genetic ZIP code that orients and redirects cells.

He’s still hunting for LINCs, hopeful that they will reveal cellular secrets. Ultimately, Rinn loves genetics for the same reason he loves snowboarding: “I want to take something old, twist it, and get something new out of it.” —Melinda Wenner

André Platzer

Crash Test Anti-Dummy Brilliant because: His software makes travel on planes, trains and automobiles safer

Affiliation: Carnegie Mellon University

Every now and then, an innovation so vital comes along that it’s hard to imagine how we got along without it. Think seatbelts, antibiotics, fire hoses. Now add André Platzer’s KeYmaera, software that helps computer-controlled safety systems avoid catastrophic errors.

Now a computer scientist at Carnegie Mellon, Platzer grew up in Germany, where he became, of all things, an accomplished ballroom dancer. “I won a few tournaments,” he says. “But I was fascinated with computers, and that began to take up my time.” In 2006, as a professor at the University of Oldenburg in Germany, he began investigating how autopilot systems could fail. When he discovered that there were no models that could test more than a handful of conditions, he built KeYmaera. Prior to it, a collision-avoidance proposal for the Federal Aviation Administration would have told two close planes with intersecting flight paths to each hang a right turn, fly a half circle, and make another right turn to avoid a collision. When KeYmaera tested what would happen to the planes at varying airspeeds, altitudes and trajectories, it found that, in rare cases, the protocol could actually put planes on a collision course. Platzer fed alternative scenarios into KeYmaera until it verified a safer fly-around maneuver. His software has also made potentially lifesaving corrections to models of Europe’s high-speed train systems and adaptive cruise control in cars. “Before you spend $1 billion on a system,” he notes, “it’s good to make sure that it works.” —Bjorn Carey

Multiflasker

When she’s not doing research or training for NASA, Kate Rubins manages her nonprofit, Congo Medical Relief, which she created to deliver medical supplies to Africa.

The Flying Virus Hunter Brilliant because: She uncovers the genetic secrets of deadly viruses, and now she’s taking her science smarts to space as an astronaut

Affiliation: Whitehead Institute, Massachusetts Institute of Technology

As a kid, Kate Rubins dreamed of being an astronaut and figured flying fighter jets would be the best way to get to NASA. She even went to space camp at age 12 to get a head start on her training. Then she learned the disappointing news that, at the time, the pilot job was off-limits to women.

Secretly, her parents hoped their daughter would choose a safer career, but by high school Rubins had already set her sights on another perilous profession: hunting killer viruses. And this time, there was no glass ceiling to hold her back. Rubins published her first paper on HIV in 1999 as an undergraduate at the University of California at San Diego. In 2001, while a Ph.D. candidate at Stanford University, she helped the U.S. Army Medical Research Institute of Infectious Diseases create the first animal model for testing smallpox, a scourge that killed millions before its eradication in 1980. Rubins’s work has made it possible to study how the virus evades the immune system in living tissue, a major step toward new medicine and vaccines should terrorists somehow get their hands on one of the two known smallpox samples. It’s this ability to make positive changes in the world that motivates Rubins. “We have a responsibility as researchers to help people,” she says.

After smallpox, Rubins quickly shifted her attention to another scourge, monkeypox, which is now reaching epidemic proportions in Africa. A cousin to smallpox, the virus is endemic to monkeys and rodents, but it can jump to humans during the slaughter or consumption of bush meat, causing facial boils, blindness and even death. During her tenure as a Whitehead fellow at MIT, Rubins spent months in the remote jungles of the Democratic Republic of Congo, eating the occasional meal of grubs (her motto: “If people serve it, I eat it”), trying to figure out why the disease appears to be spreading so quickly. The region’s underdeveloped health infrastructure makes infection rates hard to pin down, but an uptick in the number of cases suggests the virus is gaining strength.

To track the genetic evolution of monkeypox, Rubins and her team collect and analyze DNA samples from volunteer patients. Because traditional genetic-sequencing techniques can take weeks and often churn out incomplete results, she helped develop a faster, more accurate method. Typically, scientists extract monkeypox from patient samples and grow the virus on human or monkey cells. The problem is that the virus can evolve in response to its growth medium, so the final population of viruses may bear little resemblance to the ones that are infecting people in Africa. Rubins’s idea was to skip the tissue-culture step and instead rely on a new high-powered DNA sequencer to amplify all the genetic material. She then devised laboratory protocols and algorithms to sort the monkeypox from the human cells. The entire process takes less than five days and generates what Rubins calls an “obscene” amount of genetic data on the virus.

Today, the Air Force no longer bars female fighter pilots. The policy changed in 1993, but by then Rubins had already moved on. She's never been the type to sit around waiting for the tide to turn. This fall, while her team continues its work in Africa, Rubins will finally get the chance to live out that childhood dream when she joins NASA's 20th astronaut class, training to become one of the first people to fly the shuttle's successor, the Orion. Selected from thousands of candidates, she says her full-throttle hobbies of skydiving and scuba diving, not to mention her ability to thrive in dangerous places, set her apart. When asked if she's nervous about the prospect of flying a new spaceship to the moon, Rubins smiles calmly. "Not at all. I want to be the first person to fly it, right? I'm just thrilled." —Nicole Dyer

No Bones About It

Nate Dominy’s research is shining light on the role of food in human evolution.

The Tooth Sleuth Brilliant because: His exploration of ancient eating habits is helping to crack the mystery of human evolution

Affiliation: University of California, Santa Cruz

Nate Dominy found his calling on a college research trip to Costa Rica with his anatomy professors. A football player for Johns Hopkins University, Dominy was assigned the physically demanding task of catching small, drugged monkeys as they fell out of trees. “You have this moving target, completely unconscious, and you have a net in your hand,” he explains. When he went back again the next summer, he found himself thinking about more than just how the monkeys fell, and began helping to decipher the monkeys’ eating habits by studying their teeth. “I got this quick introduction to the importance of food and diet in thinking about the adaptation and behaviors of primates and humans,” he says. “I just loved every minute.”

Ten years after his transformative experience studying food and teeth, Dominy is now a trailblazer. As an associate professor of anthropology at the University of California at Santa Cruz, he works to answer one of anthropology’s biggest questions: How did modern humans evolve from our ape-like ancestors?

Dominy argues that food played a crucial role, and he recently helped solve a decade-long mystery about its role in evolution. In 1999, scientists analyzed the tooth fossils of our three-million-year-old primate ancestors, Australopithecus africanus, for chemical patterns that reveal dietary habits. Their findings suggested that grass, and the animals that ate grass, were a staple meal. But the size and shape of the fossils indicated something quite different—that our ancestors spent more time munching hard, brittle foods, such as highly starchy grass bulbs.

Dominy believes that these caloric veggies may have been the fuel of evolution, delivering enough energy to let us outwit carnivores, invent smarter ways to endure the elements and, eventually, populate the planet. In 2007 he uncovered additional evidence in support of this theory, showing that the teeth of ancient and modern African mole rats that survive entirely on bulbs have identical chemical profiles to our ancestors.

This year, Dominy hopes to crack another mystery: Why are some human populations taller than others? In October he traveled to Uganda to collect DNA from two pygmy tribes, the Twa and the Sua, who are on average less than five feet tall. He believes that short stature could help people navigate dense jungle and stay cooler. No one has ever tested this idea, and when he talks of it, Dominy sounds both excited and slightly incredulous that no one's jumped on it before. "Body size is central to survival. It affects the kinds of things we eat, how we reproduce, our metabolism," he says. "Here we are in 2009, and we still don't know why it varies so much." —Melinda Wenner

Michael Strano

Master of the Small Brilliant because: He’s tapping the strange powers of nanotechnology to detect cancer

Affiliation: Massachusetts Institute of Technology

Quantum-confined materials derive their power from their small size. For example, a single layer of carbon atoms, known as graphene, behaves nothing like normal carbon. In a conductor such as a copper wire, electrons simply inch along. In graphene, however, electrons move at nearly the speed of light. “It’s like a little particle accelerator,” Strano says. Graphene could make the ultimate solar-panel conductor; it’s highly conductive, highly affordable, and so thin that it’s transparent to light. “It’s the thinnest conductor we can ever imagine,” he says.

He is particularly fascinated by the medical potential of carbon nanotubes. The tiny structures emit near-infrared light that passes harmlessly through human tissue. Injected into cells, they could be used as biological sensors so sensitive they could detect a single molecule of a potentially harmful chemical.

Considering Strano's big to-do list, it's a little shocking to learn that he also has three kids under the age of five. Doesn't he need downtime? "Science pretty much is my hobby," he says. —Seth Fletcher

Jerome Lynch

The Bridge Whisperer Brilliant because: His bridge sensors can catch structural flaws invisible to human eyes

Affiliation: University of Michigan

Jerry Lynch is proud of his profession. He likes to point out, for instance, that the U.S. has more than 600,000 bridges, and that failures are extremely rare. “We have a very, very good track record,” he says. “We’re a diligent bunch, civil engineers.” But when something does fail, seriously bad things happen—like when the I-35W bridge collapsed in Minneapolis in 2007 and killed 13 people due to faulty gusset plates used to join load-bearing beams. It’s these catastrophic failures that motivate Lynch, an engineering professor at the University of Michigan, to think incessantly about how things come together and how to keep them from coming apart.

His solution to structural failures like the one that befell the I-35W bridge is a “sensor skin” that continuously monitors weak spots and alerts inspectors to problems before they become dangerous. “Wouldn’t it be great if we could see big structural failures coming ahead of time?” he says.

Today, the few bridges in the U.S. that have any kind of sensors usually only track seismic activity, largely because it’s so expensive to wire a bridge with enough equipment to monitor multiple threats. “The Golden Gate Bridge is over a mile long,” Lynch says. “The special conduit needed can be $10 a foot, and one sensor can cost thousands.” So instead, engineers typically rely on visual inspections at two-year intervals.

Lynch’s sensors attach to wireless nodes that communicate with other nodes on the bridge, process the data on their own, and relay potential problems back to the local inspector’s office using a cellular data connection. Each sensor consists of polymer sheets up to a foot square and just a few microns thick that cover key structural elements, like the gusset plates that gave way in Minneapolis. At programmed intervals or on command from an inspector, a small microprocessor can send an electric current through the conductive carbon nanotubes embedded in the sheets, while electrodes gauge electrical resistance to detect strain, corrosion, load and dozens of other indications of stress. Hotspots are displayed on a computerized map of the bridge. Lynch doesn’t know yet how much each sensor will cost, but just the fact that they’re wireless will make them cheaper to deploy than today’s sensors and will eliminate the costs associated with unnecessary inspections.
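The resistance-to-strain step described above can be illustrated with the standard strain-gauge relation; the gauge factor, baseline resistance, and readings below are hypothetical values chosen for illustration, not figures from Lynch’s sensors.

```python
# Illustrative only: strain from relative resistance change,
# strain = (dR / R0) / GF, the standard strain-gauge relation.
# The gauge factor and readings are hypothetical, not from Lynch's skin.

def strain_from_resistance(r0, r, gauge_factor):
    """Estimate mechanical strain from a resistance reading."""
    return (r - r0) / r0 / gauge_factor

GF = 2.0                           # assumed gauge factor for the conductive sheet
baseline = 120.0                   # ohms, unstrained reference reading
readings = [120.0, 120.6, 122.4]   # ohms, from three hypothetical nodes

THRESHOLD = 0.005                  # flag anything above 0.5% strain
flagged = [i for i, r in enumerate(readings)
           if strain_from_resistance(baseline, r, GF) > THRESHOLD]
print(flagged)   # indices of nodes whose strain exceeds the threshold
```

In a real deployment the per-node microprocessor would run a check like this locally and only relay the flagged hot spots, which is what keeps the wireless approach cheap.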

Lynch knows about using time wisely. The Queens, New York, native earned a master’s degree and a Ph.D. in civil engineering from Stanford University and then went back and got another master’s, in electrical engineering. After 9/11, he launched a company to build wireless infrastructure sensors and left it to teach at Michigan, where he was named Professor of the Year his second year on the job. “Dr. Lynch is probably the most regarded scholar among his peers in such an early stage of a career,” says Kincho Law, a professor of structural engineering at Stanford.

Lynch’s sensing skin will leave the lab next year for testing on three highway bridges in Michigan and three bridges in Korea. And he is already working on a paint-based version that could be applied to anything that needs monitoring, from airplanes to pipelines, as well as a version that would make its own power from the vibrations of whatever it’s painted on. “There’s an inherent uncertainty in visual inspections,” Lynch says. “We need better tools to keep an eye on things.” —Mike Haney

Data Science Vs Business Intelligence


Data Science vs Business Intelligence: Head-to-Head Comparison (Infographics)

Here are the top 20 comparisons between Data Science vs Business Intelligence:

Data Science vs Business Intelligence: Key Differences

Generic steps followed in Business Intelligence are as follows:

Set a business outcome to improve.

Decide which datasets are most relevant.

Clean and prepare the data.

Design KPIs, reports, and dashboards for better visualization.

Generic steps followed in Data Science are as follows:

Set a business outcome to improve or predict.

Gather all possible and relevant datasets.

Choose an appropriate algorithm to prepare a model.

Evaluate the model for accuracy.

Operationalize the model.
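The data science steps above can be sketched end to end with a deliberately tiny example; the dataset and the one-parameter threshold “model” are invented purely for illustration.

```python
# Minimal sketch of the data-science steps above: gather data, choose an
# algorithm, fit a model, and evaluate it for accuracy. All data is invented.

# 1. Gather a (tiny, synthetic) dataset: (monthly_spend, churned?)
data = [(20, 1), (35, 1), (40, 1), (55, 0), (60, 0), (80, 0), (25, 1), (70, 0)]

# 2. Split into train and test sets.
train, test = data[:6], data[6:]

# 3. "Choose an appropriate algorithm": a one-parameter threshold rule.
#    Pick the spend threshold that best separates churners on the train set.
def fit_threshold(rows):
    candidates = sorted(spend for spend, _ in rows)
    return max(candidates,
               key=lambda t: sum((spend < t) == bool(label)
                                 for spend, label in rows))

# 4. Evaluate the model for accuracy on held-out data.
def accuracy(rows, threshold):
    correct = sum((spend < threshold) == bool(label) for spend, label in rows)
    return correct / len(rows)

t = fit_threshold(train)
print(t, accuracy(test, t))
```

Operationalizing the model, the final step, would mean scoring new customers with the fitted threshold on a schedule rather than once.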

Data Science vs Business Intelligence: Comparison Table

| Basis of Comparison | Data Science | Business Intelligence |
|---|---|---|
| Complexity | Higher | Simpler |
| Data | Distributed and real-time | Siloed and warehoused |
| Role | Uses statistics and mathematics to uncover hidden patterns, analyze data, and forecast future situations. | Focuses on organizing datasets, extracting useful information, and presenting it in visualizations such as dashboards. |
| Technology | With cut-throat competition in today’s IT market, companies strive for innovation and easier solutions to complex business problems, so the technology focus is shifting towards data science. | Answers complex business questions through dashboards, surfacing relationships between variables and time periods that are hard to discover in Excel, so executives can make informed decisions based on accurate data. It does not involve prediction. |
| Usage | Helps companies anticipate future situations, enabling them to mitigate risks and increase revenue. | Helps companies perform root cause analysis to understand the reasons behind a failure or to assess their current situation. |
| Focus | The future. | The past and present. |
| Career Skill | A combination of three fields: statistics, machine learning, and programming. | Until now, most reporting and BI tasks have been conducted through Excel. |
| Evolution | Evolved out of business intelligence. | Has been around for a long time, previously limited mainly to Excel; a plethora of tools with better capabilities are now available. |
| Process | Leans towards novel experimentation; dynamic and iterative in nature. | Static in nature, with little scope for experimentation: extract the data, lightly munge it, and dashboard it. |
| Flexibility | Greater flexibility, as data sources can be added as future needs arise. | Less flexibility: data sources must be pre-planned, and adding new sources is a slow process. |
| Business Value | Greater business value, as it focuses on the future scope of the business. | A static process of extracting business value by plotting charts and KPIs, hence less business value than data science. |
| Thought Process | Helps generate new questions, encouraging the company to run strategically and efficiently. | Helps answer questions that already exist. |
| Data Quality | Analyzes data with statistical techniques and reports accuracy, precision, recall, and probabilities, instilling confidence in decision-makers. | Provides high-quality dashboarding, but only with good-quality data; the data must be sufficient to extract insights from. |
| Method | Analytic and scientific | Analytic only |
| Questions | What will happen? What if? | What happened? What is happening? |
| Approach | Proactive | Reactive |
| Expertise Role | Data scientist | Business user |
| Data Size | Built to work with large, distributed datasets. | The tools and technologies are not enough to handle big datasets. |
| Use Cases | Not a periodic task. | Many use cases involve generating and refreshing standardized dashboards. |
| Consumption | Insights are consumed at various levels, from the enterprise level down to the executive level. | Insights are consumed at the enterprise or department level. |
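The accuracy, precision, and recall values mentioned under Data Quality are simple counts over a model’s predictions; here is a minimal sketch with made-up labels.

```python
# Accuracy, precision, and recall computed from predicted vs. actual labels.
# The two label lists are made up for illustration.

actual    = [1, 0, 1, 1, 0, 0, 1, 0]
predicted = [1, 0, 1, 0, 0, 1, 1, 0]

tp = sum(a == 1 and p == 1 for a, p in zip(actual, predicted))  # true positives
fp = sum(a == 0 and p == 1 for a, p in zip(actual, predicted))  # false positives
fn = sum(a == 1 and p == 0 for a, p in zip(actual, predicted))  # false negatives

accuracy  = sum(a == p for a, p in zip(actual, predicted)) / len(actual)
precision = tp / (tp + fp)   # of everything flagged, how much was right
recall    = tp / (tp + fn)   # of everything real, how much was caught

print(accuracy, precision, recall)
```

It is exactly these numbers, reported alongside a forecast, that give decision-makers the confidence the table refers to.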

Conclusion

Business intelligence is undoubtedly a beneficial starting point for any industry. However, in the long run, adding a layer of data science will ultimately set it apart. The ability to predict the future by analyzing data is an achievement of data science. Therefore, data science plays a pivotal role and is superior to business intelligence.

Recommended Articles

Here are some further articles related to the subject:

Why Distinction Matters In Big Data And Data Science?

Data has become a resource of global interest, and harnessing its true potential is becoming important to organizations. According to IBM, 2.5 quintillion bytes of data are created every day; data never sleeps. This growth requires different tools and techniques to extract insights meaningfully. Let us first understand how the use of data is defined in the big data and data science industry.

Defining Data by Work

Terms and definitions in the big data and data science industry overlap and interweave with one another in the analytics field. They are nevertheless distinct and are used based on the nature of the work.

Data science comprises a number of disciplines, including business intelligence, computer science, data engineering, and statistics. It involves processes to collect, clean, and analyze both structured and unstructured data:

Cleaning raw data to make it ready for analysis.

Finding patterns in the data and helping decision-makers with day-to-day business problems.

Data science discovers hidden patterns within the data through dependencies between different variables. It is used across industries to make better decisions by understanding and improving existing business models.

Big data analytics, on the other hand, deals with processing large volumes of structured and unstructured data that cannot be processed with traditional methods. Big data is characterized by the 3Vs: the volume, the variety, and the velocity at which the data is processed. The key enablers of its growth are increased storage capacities, increased processing power, and the availability of huge amounts of data.

How Is Data Analyzed?

Big data and data science help organizations understand their consumers and identify new opportunities. Let’s understand how these are applied in real-world situations.

Hypothesis-based reasoning: This helps in formulating hypotheses about relationships between variables. It requires experimenting with data to test hypotheses and models.

Pattern-based reasoning: This helps discover new relationships, and the analytical path, from the data. It involves drawing inferences based on probability; the conclusions reached are reasonable, probable, and believable.

Big data analytics, by contrast, involves the following steps.

Data integration: Big data analytics starts with ingesting data from different sources; this is the first step towards the analysis. It requires integrating all types of structured, unstructured, and semi-structured data. Examples include databases, mainframes, social media, file systems, SaaS applications, and XML.

Discovery: This step involves understanding the datasets and how they relate to each other, through exploration and discovery of the data.

Iteration: Uncovering insights from data is an iterative process, as the actual relationships are not known in advance. Industry experts suggest small, well-defined projects to enable learning from the iterations.

Classification and prediction: Once the right data is collected, we go ahead with classifying and predicting. Classification models predict categorical data, while prediction models predict continuous data.

Qualifications Matter

A critical component of any organization is its team, and both data science and big data require a diverse set of skills. Data scientist and big data analyst are among the hottest job titles in the IT industry. Data scientists are highly educated: 88% have a master’s degree and 46% have a PhD. They need in-depth knowledge of statistics along with programming languages such as SAS and R. Big data analysts need technical knowledge on top of the skills a data scientist possesses: SQL databases and database querying languages, Python, Hadoop, Hive and Pig, and cloud tools like Amazon S3. In both fields, however, domain expertise contributes significantly to understanding where a problem lies and how it can be measured.

Closing Thoughts

Big data continues to occupy our day-to-day lives. When properly infused and analyzed, big data analytics can provide unique insights hidden inside the data. Both data science and big data tools and techniques require a significant investment of time across an array of tasks, and the dynamic nature of the field makes it necessary for organizations to understand both terms. No matter how many differences there are, one cannot be successful without the other.
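The distinction between classification (categorical output) and prediction (continuous output) can be made concrete with two tiny pure-Python sketches; the data in both is synthetic.

```python
# Classification predicts a category; prediction (regression) predicts a
# continuous value. Both models below are minimal and use synthetic data.

# Classification: 1-nearest-neighbour over labelled points.
labelled = [(1.0, "small"), (2.0, "small"), (8.0, "large"), (9.0, "large")]

def classify(x):
    """Return the label of the nearest labelled point."""
    return min(labelled, key=lambda pt: abs(pt[0] - x))[1]

# Prediction: least-squares line y = a*x + b fitted to numeric pairs.
pairs = [(1, 2.0), (2, 4.1), (3, 5.9), (4, 8.0)]

def fit_line(pts):
    n = len(pts)
    sx = sum(x for x, _ in pts); sy = sum(y for _, y in pts)
    sxx = sum(x * x for x, _ in pts); sxy = sum(x * y for x, y in pts)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    return a, (sy - a * sx) / n

a, b = fit_line(pairs)
print(classify(7.5))        # a categorical answer
print(round(a * 5 + b, 2))  # a continuous answer for x = 5
```

The same split drives tool choice in practice: churn yes/no is a classification problem, while next-quarter revenue is a prediction problem.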

10 Best Practices For Data Science Projects

Ten best practices for data science projects that help you solve real-world problems

The field of data science has earned the reputation of being the next big thing in technology and business. In recent years, the number of businesses using data science applications has only increased.

In this article, we’ll go over ten best practices that businesses can use to boost the success rate of their data science efforts.

First Best Practice: To get the support of the business, begin with a quick-win use case.

You must focus on use cases that share three essential characteristics:

A champion: a business leader who is ready to take responsibility for the use case’s success. The champion is essential for establishing its business significance and for gaining executive and general business support.

Clearly defined KPIs to evaluate the business impact of the results. These are necessary to show business stakeholders the measurable before-and-after impact of your project.

Accessible, available, and clean information. If your data isn’t of high quality or readily available, you run the risk of turning your quick-win into a data cleansing exercise, which is not something you want to do if you want to maintain the interest of the business.

Second Best Practice: Establish a strong Data Science organization and team.

Third Best Practice: Choose the right tools and metrics for the job.

When it comes to metrics, it’s important to choose ones that connect data science results to business objectives. Predictive algorithms, for instance, are frequently evaluated with the Root-Mean-Square-Error (RMSE) metric; depending on the underlying business objective, however, the Root-Mean-Square-Logarithmic-Error (RMSLE) metric may yield better results. For optimization algorithms, on the other hand, the metrics are typically business KPIs such as revenue or cost.
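To illustrate how the choice of metric changes what a model is penalized for, here are RMSE and a logarithmic variant side by side; the values are invented, and the log form shown is the common log1p-based RMSLE.

```python
import math

# RMSE penalizes absolute errors; RMSLE (log1p form) penalizes relative
# errors, which often matches business objectives on wide-ranging targets.
# The actual/predicted values below are invented for illustration.

def rmse(actual, predicted):
    return math.sqrt(sum((a - p) ** 2
                         for a, p in zip(actual, predicted)) / len(actual))

def rmsle(actual, predicted):
    return math.sqrt(sum((math.log1p(a) - math.log1p(p)) ** 2
                         for a, p in zip(actual, predicted)) / len(actual))

actual    = [10, 1000]
predicted = [20, 1010]   # both off by 10, but only one is off by 100%

print(round(rmse(actual, predicted), 3))   # treats the two errors identically
print(round(rmsle(actual, predicted), 3))  # dominated by the 100% relative miss
```

If the business objective cares about being off by a factor rather than by an amount, as with demand forecasts spanning small and large stores, the logarithmic metric is the better fit.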

Fourth Best Practice: Establish an early POC dashboard for business stakeholders.

Gaining business support necessitates the early creation of a POC dashboard for business stakeholders. To accomplish this, begin your project with a Design Thinking workshop with business stakeholders. During this session, come up with concepts and consider what a dashboard produced by the project would mean to them.

Fifth Best Practice: Spread the word widely and frequently.

Through regular reviews of the project’s progress, you can maintain business buy-in. In these reviews, let a stakeholder in your company lead the presentation. Instead of presenting the results in code, make use of your POC dashboard to present them in business language.

Sixth Best Practice: Use an Agile strategy.

An Agile Data Science approach should be used to guarantee consistent progress. This means breaking your project into sprints of two to three weeks, with Agile task planning and a sprint review at the end of each cycle to show examples of the results achieved. To contain the project’s scope, manage risks, and reduce uncertainty, invite all stakeholders to the sprint reviews and sprint planning.

Seventh Best Practice: Make provision for adaptable infrastructure.

One of the primary reasons Data Science POCs never make it into the real world is that, when scaling up becomes necessary, the required infrastructure is not readily available. The POC is then put on hold until infrastructure is acquired, which typically takes a long time, or until the POC is forgotten.

Eighth Best Practice: During the POC phase, ask operational questions.

Find answers to operational questions like how often models will need to be tuned, how much data will be ingested (e.g., streaming vs. a scheduled job), how much data will be produced, and how much hardware will be needed during the POC.

Ninth Best Practice: Prepare a strategy to put your POC into action.

From day one, begin planning how to put your POC into production, and include a production plan in your final POC sprint review. You may be working with a subset of the data during the POC period; to implement your POC in the real world, you must also consider other data requirements, such as governance, volume, and the role of data stewards.

Tenth Best Practice: Replace insights and predictions with optimized actions.

What Are The Applications Of Electrolysis?

What is Electrolysis?

The process in which ionic substances are decomposed into simple substances by passing an electric current through them is known as electrolysis.

In other words, the process based on the fact that electrical energy can produce chemical changes is known as electrolysis.
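The chemical change produced by a given charge can be quantified with Faraday’s first law of electrolysis (not stated above, but the standard quantitative form): the mass deposited is proportional to the charge passed. A minimal sketch, using copper values and an arbitrarily chosen current and time:

```python
# Faraday's first law: mass = (I * t / (z * F)) * M
# I = current (A), t = time (s), F = Faraday constant (C/mol),
# M = molar mass (g/mol), z = electrons transferred per ion.
# The current and duration below are arbitrary example numbers.

F = 96485.0  # C/mol, the Faraday constant

def mass_deposited(current_a, time_s, molar_mass, z):
    moles = current_a * time_s / (z * F)   # moles of metal reduced at the cathode
    return moles * molar_mass              # grams deposited

# Copper example: Cu2+ ions (z = 2), M = 63.55 g/mol, 2 A for 1 hour.
m = mass_deposited(2.0, 3600, 63.55, 2)
print(round(m, 2))   # grams of copper deposited on the cathode
```

The same relation underlies every application below, from electro-extraction to electroplating: controlling current and time controls how much metal moves to the cathode.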

Applications of Electrolysis

Nowadays the electrolytic process is widely used in various industrial applications. The major applications of electrolysis are given below.

Extraction of Metal from their Ores

The electrolytic process is used to extract pure metal from ores; this is known as electro-extraction. In electro-extraction, the metal ore is treated with a strong acid, or melted, and a DC current is passed through the resulting solution. The solution is decomposed, and pure metal is deposited on the cathode.

Refining of Metals

Electrolysis is also used for refining metals, a process termed electro-refining. In electro-refining, an anode of impure metal is placed in a suitable electrolytic solution. When DC current is passed through the solution, pure metal is deposited on the cathode.

Manufacturing of Chemicals

The electrolytic process is also used for manufacturing various chemicals. When an electric current is passed through a solution of a compound, the compound breaks down into its constituent components, which are liberated at the anode and cathode and can then be collected.

Electro-Deposition

Electro-deposition is an electrolytic process in which one metal is deposited over another metal or a non-metal. It is usually used for decorative, protective, and functional purposes.

Electroplating

An electrolytic process in which a metal is deposited over a metallic or non-metallic surface is called electroplating. Electroplating is usually used to protect metals from corrosion by atmospheric air and moisture.

Electro-deposition of Rubber

Electrolysis is also employed for the electro-deposition of rubber. Rubber latex obtained from the tree consists of very fine colloidal particles of rubber suspended in water, and these particles are negatively charged. On electrolysis of the solution, the rubber particles move towards the anode and deposit on it.

Electro-Metallization

The electrolytic process in which metal is deposited on a conducting base for decorative or protective purposes is termed electro-metallization. Using this process, a non-conductive base can also be made conductive by depositing a layer of graphite over it.

Electro-Facing

An electrolytic process in which a metallic surface is coated with a harder metal by electro-deposition in order to increase its durability is known as electro-facing.

Electro-Forming

Electrolysis is also used for electro-forming: the reproduction of an object by electro-deposition.

In electro-forming, i.e. the reproduction of medals, coins, and the like, a mould is made by impressing the object in wax. The wax surface, bearing an exact impression of the object, is coated with powdered graphite to make it conducting. This mould is then dipped in an electro-forming cell as the cathode. After a coating of the desired thickness is obtained, the article is removed and the wax core is melted out of the metal shell.

Electro-Typing

Electrotyping is an electrolytic process for forming metal parts that exactly reproduce a model. It is a special application of electro-forming and is mainly used to reproduce printing plates, type, medals, and the like.

Anodizing

The electrolytic deposition of an oxide film on a metal surface is known as anodizing. It is mainly used to increase the thickness of the natural oxide layer on the surface of metal parts.

Electro-Polishing

Electro-polishing is an electrolytic process that removes material from a metallic workpiece. It is also known as electrochemical polishing or electrolytic polishing.

Electro-polishing uses a combination of rectified current and a blended chemical electrolyte bath to remove flaws from the surface of a metal part.

Electro-Refining

Electro-refining is a method of purifying a metal using electrolysis. The anode is made of impure metal, and the impurities are left behind as the metal passes from the anode to the cathode during electrolysis.

Electro-Parting

An electrolytic process of separation of two or more metals is known as electro-parting or electro-stripping.

Electro-Cleaning

Electro-cleaning is the process of removing soil, scale or corrosion from a metallic surface. It is also known as electro-pickling. It is a form of electroplating which can be applied to all electrically conductive materials.
