For those of you anxiously awaiting the future, we have good news – it’s already here! Some technologies that were just creative visions a decade ago are a reality now. Remember the scene in one of the Back to the Future movies where Marty McFly wears a virtual reality device? A reality now! The self-driving cars in Total Recall? Or the smart homes in Electric Dreams? All real now.
Grayhats has been working on putting this technology at your fingertips at an affordable cost. Grayhats IoT devices are life-hack devices that make everyday living genuinely easier, backed by an international level of customer experience.
Grayhats IoT devices will be made available on Amazon, eBay, Flipkart and other e-commerce platforms.
We at Grayhats strive to bring this high-end technology to the smallest communities at the best price.
And what’s the technology driving these visions into reality? It’s called the Internet of Things (IoT). This fancy name for all-things-connected has been a buzzword in IT for quite some time, offering convenient solutions to many of our everyday concerns.
For the uninitiated, IoT or the Internet of Things is the technical term for an ecosystem of connected devices that collect data from their embedded sensors and send it to processing centres. The collected data is then analyzed and processed into optimized information, offering smart, intuitive, and personalized solutions.
If you’re unaware, the technology is already up and running in the market, and you may well have used it at one point or another. Your smartphone is one component of IoT, and if you’ve used a smartwatch or a health band, you’ve used IoT. From smart homes and smart toothbrushes to smart refrigerators and smart coffee brewers, IoT is touching everything.
The Rise of Grayhats IoT
As you can see, IoT is here to stay. As the everyday things we use become smarter, the role of IoT will only grow more prominent. Beyond personal devices, IoT is making its mark in industries and businesses such as manufacturing, oil and gas, mining, transportation, agriculture, retail, logistics, infrastructure, banking, healthcare, aviation and more.
If numbers fascinate you: there are currently 6 billion connected devices globally, producing 2.5 million TB of data. And let me remind you that this is just a day’s count. Three years down the line, this figure is expected to shoot up to 30 million TB every day. Now, that’s a lot of data. What’s clear is that IoT will dominate IT for the next couple of decades.
What’s in it for You?
The answer is simple – a lucrative career and an avenue to explore and leave a remarkable impact. As the amount of data generated by these connected devices keeps increasing, so does the demand for data scientists and analysts who can process it and extract crucial insights on things that matter to a business or to society.
In case you didn’t know, plenty of people out there have already woken up to IoT and realized its scope and opportunities. Programmers today are reskilling for IoT, learning the programming languages and modules needed to become IoT experts. However, IoT is such an extensive field that there is a place in it not just for programmers but for anyone with the right skillset.
Grayhats is exploring the impact of artificial intelligence on design in a series of blog posts. Our first piece outlined the unprecedented changes AI and IoT will bring to jobs in general, and our second piece looked at automation and intelligent machines. Here, we investigate what it takes for some professions to compete with machines.
For anyone doubting that AI is here, the New York Times recently reported that Carnegie Mellon University plans to create a research centre that focuses on the ethics of artificial intelligence. Harvard Business Review started laying the foundation for what it means for management, and CNBC started analysing promising AI stocks. I made the relatively optimistic case that design in the short term is safe from AI because good design demands creative and social intelligence.
But this short-term positive outlook did not alleviate all of my concerns. This year, my daughter started college, pursuing a degree in interaction design. As I began to explore how AI would affect design, I started wondering what advice I would give my daughter and a generation of future designers to help them not only be relevant, but thrive in the future AI world.
Here is what I think they should expect and be prepared for in 2020:
Everyone will be a designer
Today, most design jobs are defined by creative and social intelligence. These skill sets require empathy, problem framing, creative problem solving, negotiation, and persuasion. The first impact of AI will be that more and more non-designers develop their creativity and social intelligence skills to bolster their employability. In fact, in the Harvard Business Review article I mentioned above, advice #4 to managers is to act more like designers.
The implication for designers is that more than just the traditional creative occupations will be trained to use “design thinking” techniques to do their work. Designers will no longer hold a monopoly (if that were ever true) on being the most “creative” people in the room. To stay competitive, more designers will need additional knowledge and expertise to contribute in multidisciplinary contexts, perhaps leading to increasingly exotic specializations. You can imagine a classroom, where an instructor trained in design thinking is constantly testing new interaction frameworks to improve learning. Or a designer/hospital administrator who is tasked with rethinking the inpatient experience to optimize it for efficiency, ease of use, and better health outcomes. We’re already seeing this trend emerge—the Seattle mayor’s office has created an innovation team to find solutions to Seattle’s most immediate issues and concerns. The team embraces human-centred design as a philosophy, and includes designers and design strategists.
Stanford’s d.school has been developing the creative intelligence of non-traditionally trained designers for over a decade. And new programs like MIT’s Integrated Design and Management program are also emerging. Even medical schools are starting to train future physicians in design thinking. This speaks to design’s broader relevance, but also to a new opportunity for educators across disciplines to include creative intelligence training and human-centered design in their curricula.
Designers as curators, not creators
The real breakthrough with DeepMind’s Deep Q-Network, and its successor AlphaGo—the computer program that plays the board game Go—is that the AI doesn’t have any domain knowledge or expertise in game play. And it doesn’t even need someone to codify the rules of how to play. It just has visual input, controls, and an objective of trying to maximize its score. To that extent, games are an ideal test environment for artificial intelligence to learn.
But what about design? That’s where the curator role comes in. In the future, designers will train their AI tools to solve design problems by creating models based on their preferences.
For instance, after years of working in the health care space, Artefact has developed a deep and broad perspective on the key issues in digital health design necessary for changing patient behaviours. I can imagine a time when we will have enough data to enter behaviour goals and ask the AI system to design a solution framework that overcomes anticipated issues like confirmation bias and the empathy gap.
Designing AI, designing the future of humanity
By framing the argument to show how AI is stealing our design jobs, I’ve perhaps done a disservice to AI’s contributions to the design profession. When humans and computers work together, they can do amazing things that neither could do alone—just take a look at Michael Hansmeyer’s unimaginable shapes. With their millions of facets, these forms cannot be built by a human alone, yet they can redefine architecture.
While this is just one example, there is something undeniably appealing about finding ways to amplify our creativity as individuals and across professions. I can see the potential for a future where our personal AI assistants, armed with a deep understanding of our influences, heroes, and inspirations, constantly critique our work, suggesting ideas and areas of improvement. A world where problem-solving bots help us see a problem from a variety of perspectives, through different frameworks. Where simulated users test things we’ve designed to see how they will perform in a variety of contexts and suggest improvements, before anything is even built. Where A/B testing bots are constantly looking for ways to suggest minor performance optimizations to our design work.
Far from threatening the design occupation, AI offers a huge opportunity for design, especially for those involved in designing the interactions we have with the emerging AI systems. How do we design those AI design tools? How will we design the intelligent services and platforms of our future? How should we design these systems in a way that helps us augment our creativity, our relationships with the world, our humanity?
That is a tall order and an exciting opportunity for us and for the generations to come.
What can Artificial Intelligence and IoT do for us?
Essentially, artificial intelligence is about creating smart machines and software that work and react like humans. The technology is likely much more varied and sophisticated than what you may have seen in science fiction movies. AI is being developed today that works on everything from speech recognition to analytics, problem solving, and beyond. AI can take on many different forms, from customer service robots to Internet-of-Things connected smart machines, and could look like anything from data processing machines to virtual assistants and sensor-based manufacturing components. And across all industries, it seems like AI is revolutionizing how people work.
Grayhats is trying to create products combining these technologies. We truly believe AI, together with data science and the Internet of Things, can help us build and scale revolutionary products.
The idea of AI is far from new: it was first written about by Alan Turing in 1950, when he posed the question, “Can machines think?” The idea was particularly forward-thinking considering that the first general-purpose computer had only just been created. AI has been in the works for decades, but it is only recently that the technology has been in place to make theoretical dreams a reality. The two biggest developments to bring AI to the forefront were big data and computing power; AI relies on vast amounts of data to be a truly effective intelligent system, but until recently that data wasn’t available, and computers couldn’t have handled it if it was.
AI is also a fundamental part of the concept of the Internet of Things – a world where machines and devices all communicate with each other to get the work done, leaving us free to relax and enjoy life.
However, as we’ve previously seen with the internet revolution, and the big data revolution, and all the other technological revolutions of recent times, there are obstacles to be overcome before we reach this technological utopia. As businesses scramble for their share of a $70 billion market, some will inevitably prosper and some will fail. Those that manage to succeed are likely to be those which can manage to see beyond the hype – and answer hard questions about how this technology can add real value and drive positive change.
The concern that this technology will lead to widespread unemployment is also beyond the scope of this piece, but it does touch on the first point I want to make. Employees are often a business’s biggest expense, but does that mean it’s sensible to think of AI as primarily a means of cutting HR costs? I don’t think so. The fully autonomous, AI-powered, human-free industrial operation is still some way from becoming reality and human employees working alongside AI machines is likely to be the way of things for a while yet.
The field was founded on the claim that human intelligence “can be so precisely described that a machine can be made to simulate it”. This raises philosophical arguments about the nature of the mind and the ethics of creating artificial beings endowed with human-like intelligence, issues which have been explored by myth, fiction and philosophy since antiquity. Attempts to create artificial intelligence have experienced many setbacks, including the ALPAC report of 1966, the abandonment of perceptrons in 1970, the Lighthill Report of 1973, the second AI winter of 1987–1993 and the collapse of the Lisp machine market in 1987. In the twenty-first century, AI techniques have become an essential part of the technology industry, helping to solve many challenging problems in computer science.
ZigBee is a wireless networking standard aimed at remote control and sensor applications, suitable for operation in harsh radio environments and in isolated locations.
ZigBee technology builds on IEEE standard 802.15.4 which defines the physical and MAC layers. Above this, ZigBee defines the application and security layer specifications enabling interoperability between products from different manufacturers. In this way ZigBee is a superset of the 802.15.4 specification.
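The layering described above can be pictured as a simple stack, with the IEEE 802.15.4 layers at the bottom and the ZigBee-defined layers on top. The snippet below is a minimal illustrative sketch of that stack (the names and helper function are our own, not from any real library):

```python
# Sketch of the ZigBee stack layering, bottom to top: IEEE 802.15.4
# supplies the PHY and MAC layers, and ZigBee defines the layers above.
STACK = [
    ("PHY", "IEEE 802.15.4"),
    ("MAC", "IEEE 802.15.4"),
    ("Network / Security", "ZigBee"),
    ("Application / Profiles", "ZigBee"),
]

def defined_by(layer: str) -> str:
    """Return which standard defines a given layer of the stack."""
    return dict(STACK)[layer]

print(defined_by("MAC"))                    # → IEEE 802.15.4
print(defined_by("Application / Profiles")) # → ZigBee
```

This is why ZigBee is described as a superset of 802.15.4: an 802.15.4 radio alone gives you the bottom two layers, while ZigBee adds the interoperable layers above them.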
With the applications for remote wireless sensing and control growing rapidly it is estimated that the market size could reach hundreds of millions of dollars as early as 2007. This makes ZigBee technology a very attractive proposition for many applications.
ZigBee standards and ZigBee Alliance
The ZigBee standard is organised under the auspices of the ZigBee Alliance. This organisation has over seventy members, of which seven have taken on the status of what they term “promoter.” These companies are Ember, Honeywell, Invensys, Mitsubishi, Motorola, Philips, and Samsung. Under the umbrella of the ZigBee Alliance, the standard is pushed forward, taking on board the requirements of users, manufacturers and system developers.
ZigBee standards and releases
- ZigBee 2004: The original release of ZigBee – defined as ZigBee 1.0 and publicly released in June 2005.
- ZigBee 2006: Introduced the concept of a cluster library; released in September 2006.
- ZigBee 2007: The next version of the ZigBee standard, released publicly in October 2008, containing two different profile classes.
- ZigBee PRO: A profile class released with the ZigBee 2007 release, providing additional features required for robust deployments, including enhanced security.
- RF4CE: Radio Frequency for Consumer Electronics – a standard aimed at audio-visual applications, taken on board by the ZigBee Alliance; Version 1.0 of the standard was released in 2009.
The distances that can be achieved transmitting from one station to the next extend up to about 70 metres, although very much greater distances may be reached by relaying data from one node to the next in a network.
The main applications for 802.15.4 are aimed at control and monitoring applications where relatively low levels of data throughput are needed, and with the possibility of remote, battery powered sensors, low power consumption is a key requirement. Sensors, lighting controls, security and many more applications are all candidates for the new technology.
Physical and MAC layers
The system is specified to operate in one of the three license free bands at 2.4 GHz, 915 MHz for North America and 868 MHz for Europe. In this way the standard is able to operate around the globe, although the exact specifications for each of the bands are slightly different. At 2.4 GHz there are a total of sixteen different channels available, and the maximum data rate is 250 kbps. For 915 MHz there are ten channels and the standard supports a maximum data rate of 40 kbps, while at 868 MHz there is only one channel and this can support data transfer at up to 20 kbps.
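The band parameters above can be summarised as a small lookup table. The sketch below encodes the figures from the paragraph (the dictionary and function names are illustrative, not part of any standard API):

```python
# 802.15.4 band parameters as described in the text: number of channels
# and maximum over-the-air data rate for each licence-free band.
BAND_PARAMS = {
    "2.4 GHz": {"region": "worldwide",      "channels": 16, "max_rate_kbps": 250},
    "915 MHz": {"region": "North America",  "channels": 10, "max_rate_kbps": 40},
    "868 MHz": {"region": "Europe",         "channels": 1,  "max_rate_kbps": 20},
}

def max_rate_kbps(band: str) -> int:
    """Return the maximum data rate (kbps) for a given band."""
    return BAND_PARAMS[band]["max_rate_kbps"]

print(max_rate_kbps("2.4 GHz"))  # → 250
```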
The modulation techniques also vary according to the band in use. Direct sequence spread spectrum (DSSS) is used in all cases. However for the 868 and 915 MHz bands the actual form of modulation is binary phase shift keying. For the 2.4 GHz band, offset quadrature phase shift keying (O-QPSK) is employed.
In view of the fact that systems may operate in heavily congested environments, and in areas where levels of extraneous interference are high, the 802.15.4 specification incorporates a variety of features to ensure reliable operation. These include link quality indication, receiver energy detection and clear channel assessment. CSMA (Carrier Sense Multiple Access) techniques are used to determine when to transmit, and in this way unnecessary clashes are avoided.
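The CSMA scheme used here is the unslotted CSMA-CA algorithm: before each transmission the node waits a random backoff, checks the channel, and widens the backoff window each time the channel is found busy. The sketch below follows the 802.15.4 default constants; `channel_clear` is a stand-in for the radio's clear channel assessment, and a real MAC would actually sleep during the backoff:

```python
import random

# Unslotted CSMA-CA, as used by 802.15.4 before each transmission.
# Constants are the spec defaults; the sleep step is omitted for brevity.
MAC_MIN_BE = 3            # minimum backoff exponent
MAC_MAX_BE = 5            # maximum backoff exponent
MAC_MAX_CSMA_BACKOFFS = 4 # attempts before declaring channel access failure

def csma_ca_transmit(channel_clear) -> bool:
    """Return True if the channel was found clear and we may transmit."""
    nb, be = 0, MAC_MIN_BE
    while nb <= MAC_MAX_CSMA_BACKOFFS:
        # Wait a random number of unit backoff periods in [0, 2^BE - 1].
        backoff_periods = random.randint(0, 2 ** be - 1)
        # (A real MAC would sleep here for backoff_periods * 320 us.)
        if channel_clear():
            return True           # channel idle: transmit now
        nb += 1                   # channel busy: try again with a wider window
        be = min(be + 1, MAC_MAX_BE)
    return False                  # give up: channel access failure

print(csma_ca_transmit(lambda: True))   # → True (always-idle channel)
print(csma_ca_transmit(lambda: False))  # → False (always-busy channel)
```

The exponential growth of the backoff window is what spreads competing transmitters out in time when the channel is congested.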
The data is transferred in packets. These have a maximum size of 128 bytes, allowing for a maximum payload of 104 bytes. Although this may appear low when compared to other systems, the applications in which 802.15.4 and ZigBee are likely to be used should not require very high data rates.
The standard supports 64 bit IEEE addresses as well as 16 bit short addresses. The 64 bit addresses uniquely identify every device in the same way that devices have a unique IP address. Once a network is set up, the short addresses can be used, enabling over 65,000 nodes to be supported.
It also has an optional superframe structure with a method for time synchronisation. In addition, it is recognised that some messages need to be given a high priority. To achieve this, a guaranteed time slot mechanism has been incorporated into the specification, enabling high-priority messages to be sent across the network as swiftly as possible.
Upper layers (ZigBee)
Above the physical and MAC layers defined by 802.15.4, the ZigBee standard itself defines the upper layers of the system. This includes many aspects including the messaging, the configurations that can be used, along with security aspects and the application profile layers.
There are three different network topologies that are supported by ZigBee, namely the star, mesh and cluster tree or hybrid networks. Each has its own advantages and can be used to advantage in different situations.
The star network is commonly used, having the advantage of simplicity. As the name suggests it is formed in a star configuration with outlying nodes communicating with a central node.
Mesh or peer to peer networks enable high degrees of reliability to be obtained. They consist of a variety of nodes placed as needed, and nodes within range being able to communicate with each other to form a mesh. Messages may be routed across the network using the different stations as relays. There is usually a choice of routes that can be used and this makes the network very robust. If interference is present on one section of a network, then another can be used instead.
Finally there is what is known as a cluster tree network. This is essentially a combination of star and mesh topologies.
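The robustness claim for mesh networks above can be made concrete with a toy model: represent each topology as an adjacency list and find routes with a breadth-first search. In the mesh, a route survives the failure of a relay node, while the star dies with its hub. Everything here (node names, the `route` helper) is an illustrative sketch, not ZigBee's actual routing protocol:

```python
from collections import deque

# Star: outlying nodes only talk to the central hub.
star = {"hub": ["a", "b", "c"], "a": ["hub"], "b": ["hub"], "c": ["hub"]}
# Mesh: every node relays for its neighbours, so alternate paths exist.
mesh = {"a": ["b", "c"], "b": ["a", "c", "d"],
        "c": ["a", "b", "d"], "d": ["b", "c"]}

def route(net, src, dst, down=()):
    """Breadth-first route from src to dst, skipping failed nodes."""
    queue, seen = deque([[src]]), {src}
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path
        for nxt in net[path[-1]]:
            if nxt not in seen and nxt not in down:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # no surviving path

print(route(mesh, "a", "d"))                 # → ['a', 'b', 'd']
print(route(mesh, "a", "d", down=("b",)))    # → ['a', 'c', 'd'] (rerouted)
print(route(star, "a", "c", down=("hub",)))  # → None (star needs its hub)
```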
Both 802.15.4 and ZigBee have been optimised to make low power consumption a key feature. Although nodes with sensors or control mechanisms towards the centre of a network are more likely to have mains power, many towards the edges may not. The low power design enables battery life typically measured in years, so the network does not require constant maintenance.
Although an increasing number of wireless standards are appearing, ZigBee has a distinct area of focus. It is not intended to compete with standards such as 802.11, Bluetooth and the like. Instead it has been optimised to meet its intended requirements, fulfilling the needs of remote control and sensing applications.
How to Think about the Internet of Things (IoT)
Many people have tried to define the Internet of Things. But as a hardware or software engineer, you already know the essential element: building interconnected products.
Embedded systems are already playing a crucial role in the development of the IoT. In broad strokes, there are four main components of an IoT system:
- The Thing itself (the device)
- The Local Network; this can include a gateway, which translates proprietary communication protocols to Internet Protocol
- The Internet
- Back-End Services; enterprise data systems, or PCs and mobile devices
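The four components above can be sketched as a toy end-to-end pipeline: a device emits a reading in some proprietary format, the gateway translates it to JSON for transport over IP, and a back-end service consumes it. All names and message formats here are illustrative assumptions, not any real product's API:

```python
import json

def device_reading() -> str:
    # 1. The Thing: a sensor emitting "id;metric;value" over a local link.
    return "sensor-42;temperature;21.5"

def gateway_translate(raw: str) -> str:
    # 2. The Local Network: the gateway converts the proprietary framing
    #    into JSON suitable for transport over Internet Protocol.
    dev, metric, value = raw.split(";")
    return json.dumps({"device": dev, "metric": metric, "value": float(value)})

def backend_ingest(payload: str) -> dict:
    # 3-4. The Internet carries the payload; the back-end service parses it.
    return json.loads(payload)

record = backend_ingest(gateway_translate(device_reading()))
print(record["value"])  # → 21.5
```

The gateway step is the interesting one: it is exactly the translation role described in the component list, and it is what lets non-IP local networks participate in the wider Internet.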
IoT systems are not complicated, but designing and building them can be a complex task. And even though new hardware and software is being developed for IoT systems, we already have all the tools we need today to start making the IoT a reality.
We can also separate the Internet of Things in two broad categories:
- Industrial IoT, where the local network is based on any one of many different technologies. The IoT device will typically be connected through an IP network to the global Internet.
- Commercial IoT, where local communication is typically either Bluetooth or Ethernet (wired or wireless). The IoT device will typically communicate only with local devices.
So to better understand how to build IoT devices, you first need to figure out how they will communicate with the rest of the world.
Your Local Network
Your choice of communication technology directly affects your device’s hardware requirements and costs. Which networking technology is the best choice? IoT devices are deployed in so many different ways — in clothing, houses, buildings, campuses, factories, and even in your body — that no single networking technology can fit all bills.
Let’s take a factory as a typical case for an IoT system. A factory would need a large number of connected sensors and actuators scattered over a wide area, and a wireless technology would be the best fit.
Wireless sensor network installed in a factory, connected to the Internet via a gateway
A wireless sensor network (WSN) is a collection of distributed sensors that monitor physical or environmental conditions, such as temperature, sound, and pressure. Data from each sensor passes through the network node-to-node.
WSN nodes are low cost devices, so they can be deployed in high volume. They also operate at low power so that they can run on battery, or even use energy harvesting. A WSN node is an embedded system that typically performs a single function (such as measuring temperature or pressure, or turning on a light or a motor).
Energy harvesting is a new technology that derives energy from external sources (for example, solar power, thermal energy, wind energy, electromagnetic radiation, kinetic energy, and more). The energy is captured and stored for use by small, low-power wireless autonomous devices, like the nodes on a WSN.
WSN Edge Nodes
A WSN edge node is a WSN node that includes Internet Protocol connectivity. It acts as a gateway between the WSN and the IP network. It can also perform local processing, provide local storage, and can have a user interface.
The battle over the preferred networking protocol is far from over. There are multiple candidates.
The first obvious networking technology candidate for an IoT device is Wi-Fi, because it is so ubiquitous. Certainly, Wi-Fi can be a good solution for many applications. Almost every house that has an Internet connection has a Wi-Fi router.
However, Wi-Fi needs a fair amount of power. There are myriad devices that can’t afford that level of power: battery operated devices, for example, or sensors positioned in locations that are difficult to power from the grid.
Newer networking technologies are allowing for the development of low-cost, low-power solutions. These technologies support the creation of very large networks of very small intelligent devices. Currently, major R&D efforts include:
- Low-power and efficient radios, allowing several years of battery life
- Energy harvesting as a power source for IoT devices
- Mesh networking for unattended long-term operation without human intervention (for example, M2M networks)
- New application protocols and data formats that enable autonomous operation
For example, EnOcean has patented an energy-harvesting wireless technology to meet the power consumption challenge. EnOcean’s wireless transmitters work in the frequencies of 868 MHz for Europe and 315 MHz for North America. The transmission range is up to 30 meters in buildings and up to 300 meters outdoors.
One of the major IoT enablers is the IEEE 802.15.4 radio standard, released in 2003. Commercial radios meeting this standard provide the basis for low-power systems. This IEEE standard was extended and improved in 2006 and 2011 with the 15.4e and 15.4g amendments. Power consumption of commercial RF devices is now cut in half compared to only a few years ago, and we are expecting another 50% reduction with the next generation of devices.
Devices that take advantage of energy-harvesting must perform their tasks in the shortest time possible, which means that their transmitted messages must be as small as possible. This requirement has implications for protocol design. And it is one of the reasons why 6LoWPAN (short for IPv6 over Low power Wireless Personal Area Networks) has been adopted by ARM (Sensinode) and Cisco (ArchRock). 6LoWPAN provides encapsulation and header compression mechanisms that allow for briefer transmission times.
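The payoff of header compression is easy to quantify. An 802.15.4 frame carries at most 127 bytes, so a full uncompressed IPv6 header (40 bytes) plus UDP (8 bytes) consumes a large share of every frame, while a well-compressed 6LoWPAN header can shrink that pair to a few bytes. The sketch below uses an illustrative MAC overhead figure and a typical best-case compressed header size, both assumptions for the sake of the arithmetic:

```python
# How much application payload fits in one 802.15.4 frame, given the
# IP+UDP header size? Frame size is the 802.15.4 PHY maximum; the MAC
# overhead figure is illustrative (it varies with addressing and security).
FRAME_MAX = 127      # 802.15.4 maximum frame size (bytes)
MAC_OVERHEAD = 23    # illustrative MAC header + FCS overhead (bytes)

def payload_room(ip_udp_header: int) -> int:
    """Bytes left for application payload in a single frame."""
    return FRAME_MAX - MAC_OVERHEAD - ip_udp_header

print(payload_room(40 + 8))  # uncompressed IPv6 + UDP → 56 bytes left
print(payload_room(6))       # well-compressed 6LoWPAN header → 98 bytes left
```

Nearly doubling the per-frame payload also means fewer frames per message, which directly shortens radio-on time for energy-harvesting devices.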
Wireless radio technologies
- ZigBee / IEEE 802.15.4: frequency bands 868/915 MHz and 2.4 GHz; data rate 20 to 250 kbps; range 10 to 100 m; battery: alkaline (months to years)
- Wi-Fi / IEEE 802.11: frequency bands 2.4 and 5.8 GHz; data rate 11 to 105 Mbps; range 10 to 300 m; battery: rechargeable (days to weeks)
There are many wireless networks available that are specialized for various industries. The following is a brief list:
- ZigBee and ZigBee IP
And there are many more.
At Grayhats, we believe that any protocol that carries IP packets has an advantage over all others. The connectivity requirements for IoT devices are so diverse that a single technology cannot meet all the range, power, size and cost requirements. Nonetheless, we believe that 6LoWPAN will be the choice for WSNs and light IP-based protocols (see next section).
IPv6 is Key for IoT
If your IoT network is local and M2M-only, then the wireless protocols discussed above are all good candidates. But if your goal is to remotely control devices or otherwise transmit data over the Internet, then you need IPv6.
The usefulness of IoT devices resides not only in local communication, but also in global communication. If at all possible, it is crucial that your IoT networks (LANs, PANs, and BANs) all make use of the suite of Internet Protocols (IP, UDP, TCP, SSL, HTTP, and so on). Furthermore, your networks must support Internet Protocol version 6, as the current IPv4 standard faces a global addressing shortage, as well as limited support for multicast, and poor global mobility.
IPv6’s addressing scheme provides more addresses than there are grains of sand on earth — some have calculated that it could be as high as 10^30 addresses per person (compare that number to the fact that there are about 10^28 atoms in a human body!). With IPv6, it is much simpler for an IoT device to obtain a global IP address, which enables efficient peer-to-peer communication.
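These numbers are easy to sanity-check. IPv6 addresses are 128 bits, so there are 2**128 of them; dividing by a world population of roughly 8 billion (an assumed figure for illustration) gives an allocation on the order of 10^28 addresses per person:

```python
# Back-of-the-envelope check on the IPv6 address space.
total_addresses = 2 ** 128                      # 128-bit address space
per_person = total_addresses // 8_000_000_000   # assumed world population

print(total_addresses)       # → 340282366920938463463374607431768211456
print(len(str(per_person)))  # → 29, i.e. on the order of 10**28 each
```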
The importance of IP to the Internet of Things does not automatically mean that non-IP networks are useless. It just means that non-IP networks require a gateway to reach the Internet.
Referring back to the illustration at the top of the page, you can see clearly that your local network is only one part of the Internet of Things. 6LoWPAN, because it carries an IPv6 address with a compressed header, offers Internet connectivity without too much additional overhead. 6LoWPAN also has an advantage over other personal area networks, because peer-to-peer communication is simpler to implement when each device has a global address.