Search-Enable Your Application with Lucene

The e-commerce Web site that I work on has seen several incarnations of its search feature. We started with plain-vanilla SQL using "like" clauses, but this didn't perform well and left a lot to be desired in terms of language features such as stemming (e.g., "paint" = "painter" = "painting") and synonym matching (e.g., "cat" = "feline"). Next we tried an off-the-shelf solution. It addressed our efficiency and language demands, but it was riddled with strange quirks and limited how much we could customize its behavior.

Then we discovered Lucene. Lucene is an open-source search framework from Apache's Jakarta project. As a framework, Lucene provides you with the building blocks you need to build a search engine that meets your specific searching requirements. Lucene is flexible, fully customizable, and amazingly fast.

In this article I show you how to use Lucene to build a search solution for your application. Although my examples are geared toward an e-commerce application, Lucene is flexible enough to be used in any application, whether it's Web, desktop, or CD-ROM based.

I used version 1.2 of Lucene to develop the examples in this article. It can be downloaded from http://jakarta.apache.org/lucene. Lucene is self-contained, so you'll need only a JVM (v1.1.8 or higher) to use it. Place lucene-1.2.jar into your classpath and you're ready to start.

Indexing Documents
To build a Lucene index, first you'll need an instance of IndexWriter. The following lines of code create an IndexWriter for an index located at c:\myindex.

Analyzer analyzer = new StopAnalyzer();
IndexWriter writer = new IndexWriter("c:/myindex", analyzer, true);

The first argument to the constructor is the path where the index will be written. If the path doesn't already exist, Lucene will create it for you. The second argument is the Analyzer you want IndexWriter to use when tokenizing text. Here I used StopAnalyzer to remove stop words ("and," "or," "the," etc.) from the token stream. The last argument tells IndexWriter whether to create a new index or to add documents to an existing one. Passing true to the constructor will create the index from scratch; passing false will append to an existing index.

Now that you have an IndexWriter, you're ready to start adding documents to the index. The following code creates a simple document that represents a Web page and uses IndexWriter to add it to the index.

String url = "http://jakarta.apache.org/lucene";
String content = indexer.retrieveWebPageContent(url);
String keywords = indexer.extractKeywords(content);

Document doc = new Document();
doc.add(Field.UnIndexed("url", url));
doc.add(Field.UnStored("keywords", keywords));
doc.add(Field.Text("content", content));
writer.addDocument(doc);

In this example, the document contains a "url" field holding the address of Lucene's home page, a "keywords" field containing search terms to match against in a search, and a "content" field containing the full content of the Web page.

Once all documents have been added, all that remains is to close the index.

writer.close();

Although this example adds only a single (hard-coded) document to an index, it serves well as a "Hello World" example of how to create indexes using Lucene. The complete source code for this example is in Listing 1. (Listings 1-10 can be downloaded from www.sys-con.com/java/sourcec.cfm.)

For a more interesting example, suppose you're indexing a product catalog to be searched on an e-commerce Web site. A product is made up of a SKU, a name, a price, and some keywords to be searched on (see Listing 2). ProductIndexer (see Listing 3) is a convenience class used to add products to a Lucene index.

The constructor for ProductIndexer takes a string that's the path where the Lucene index will be built and a Boolean parameter that specifies whether a new index will be created or an existing index appended. ProductIndexer uses StopAnalyzer for tokenizing text.

The addProduct() method creates an instance of Document and translates the attributes of the Product into document fields. As in the simple example earlier, the "keywords" field is created as unstored so it can be searched upon but is unavailable for retrieval. The other fields are created as unindexed because these fields will be retrieved only after a successful search, not searched upon themselves.

The close() method closes the IndexWriter, making it available for searching. Before closing, however, a call is made to the IndexWriter's optimize() method to have Lucene optimize the index. Although it's entirely optional, it's generally a good idea to call optimize() if the indexing is finished for the time being and no further documents will be added to the index for a while.
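
Listings 2 and 3 aren't reproduced here, but a minimal sketch of a ProductIndexer along the lines just described might look like the following. The Product accessors (getSku(), getName(), getPrice(), getKeywords()) are assumed from the description of Listing 2, so details may differ from the actual listing.

import java.io.IOException;

import org.apache.lucene.analysis.StopAnalyzer;
import org.apache.lucene.document.Document;
import org.apache.lucene.document.Field;
import org.apache.lucene.index.IndexWriter;

// Minimal sketch of a ProductIndexer as described above. The Product
// accessors are assumptions; the actual Listing 3 may differ in detail.
public class ProductIndexer {
    private IndexWriter writer;

    public ProductIndexer(String indexPath, boolean create) throws IOException {
        // StopAnalyzer strips common stop words while tokenizing
        writer = new IndexWriter(indexPath, new StopAnalyzer(), create);
    }

    public void addProduct(Product product) throws IOException {
        Document doc = new Document();
        // Stored but not indexed: retrieved for display, never searched on
        doc.add(Field.UnIndexed("sku", product.getSku()));
        doc.add(Field.UnIndexed("name", product.getName()));
        doc.add(Field.UnIndexed("price", String.valueOf(product.getPrice())));
        // Indexed and tokenized but not stored: searchable, not retrievable
        doc.add(Field.UnStored("keywords", product.getKeywords()));
        writer.addDocument(doc);
    }

    public void close() throws IOException {
        writer.optimize();  // consolidate segments for faster searching
        writer.close();
    }
}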

ProductDBIndexer (see Listing 4) reads products from a "catalog" table in a relational database (see Table 1 for the products that I used) and uses ProductIndexer to add the products to Lucene's index. ProductDBIndexer takes two command-line arguments: the path in which to build the index and an optional "create" flag to indicate that the index should be built from scratch.
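
Listing 4 and Table 1 aren't shown here either, so the following is only a rough sketch of what such a command-line indexer might look like; the JDBC URL, credentials, column names, and the Product constructor are placeholders for illustration.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

// Rough sketch: read rows from a "catalog" table and index them with
// ProductIndexer. Connection details and column names are placeholders.
public class ProductDBIndexer {
    public static void main(String[] args) throws Exception {
        String indexPath = args[0];
        boolean create = args.length > 1 && "create".equals(args[1]);

        ProductIndexer indexer = new ProductIndexer(indexPath, create);
        Connection conn = DriverManager.getConnection(
                "jdbc:somedb://localhost/catalog", "user", "password");
        try {
            Statement stmt = conn.createStatement();
            ResultSet rs = stmt.executeQuery(
                    "SELECT sku, name, price, keywords FROM catalog");
            while (rs.next()) {
                indexer.addProduct(new Product(rs.getString("sku"),
                        rs.getString("name"), rs.getFloat("price"),
                        rs.getString("keywords")));
            }
            rs.close();
            stmt.close();
        } finally {
            conn.close();
            indexer.close();
        }
    }
}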

Lucene Index Structure
Lucene indexes are file based. If you look in the directory where you created the index, you'll find several files that define the Lucene index. Depending on how large your index is, you'll see several groups of files where each file in a group has the same name but a different extension. Each of these groups is known as a "segment." Although this article won't delve into the details of how Lucene segments work, it may be interesting to note that IndexWriter's optimize() method optimizes Lucene's index by consolidating all segments into a single segment for more efficient searching.

While IndexWriter is writing indexes, a file called "write.lock" is created. This file prevents other instances of IndexWriter from writing to the index concurrently. Calling IndexWriter's close() method removes this file and makes the index available for writing by another IndexWriter.

Lucene keeps track of each segment in the index using a file called "segments". During indexing, it occasionally becomes necessary for Lucene to update the segments file to keep it synchronized with the segments in the index. While this synchronization is going on, Lucene creates a "commit.lock" file to prevent concurrent updates of the segments file. Once the segments file is in sync, the commit.lock file is removed.

What would happen if you were to write to an index while it's being searched on? You may write to the index (either by adding new documents or re-creating the index from scratch) while it's being searched, but doing so may have undesirable effects on the search results. The worst side effect that I've seen is a document appearing out of order in the Hits collection. Depending on how important the ordering is to you, it may be best to create your indexes off-line (i.e., in another directory) and then rename the directory to become the current index.

Searching
Now that you've built an index, it's time to perform search queries against it. ProductSearcher (see Listing 5) shows how to do this.

To search a Lucene index you need an instance of org.apache.lucene.search.Searcher. Two subclasses of Searcher come with Lucene: IndexSearcher searches a single Lucene index, while MultiSearcher searches multiple indexes at once. Only the product catalog index will be searched, so IndexSearcher is the best choice for this example. It's constructed with the path to the index.

Searcher searcher = new IndexSearcher(indexPath);

Next you must construct a Query object. The best way to do this is to use the parse() method of org.apache.lucene.queryParser.QueryParser. Create an instance of QueryParser, passing the name of the default field (the field that's searched upon by default) and an analyzer to the constructor. Then call parse() on the QueryParser instance, passing the query string. An instance of org.apache.lucene.search.Query will be returned.

QueryParser queryParser = new QueryParser("keywords", new StopAnalyzer());
Query query = queryParser.parse("cat food");

Note: QueryParser is not thread-safe. A new instance of QueryParser should be created for each thread.

For this example the query string is hardcoded as "cat food". This query will match all documents containing either "cat" or "food", but not necessarily both. To require that a document's keywords field contain both words, place a plus (+) sign in front of each one, making the query string "+cat +food". More advanced search options will be discussed later.

Next make a call to the Searcher's search() method, passing in the Query object.

Hits hits = searcher.search(query);

The search() method returns an instance of org.apache.lucene.search.Hits. The Hits class represents a collection of documents matching the search criteria, along with each document's relevancy score. These scores range from 0.0 to 1.0 where 1.0 is considered highly relevant and 0.0 is considered completely irrelevant (and not included in the Hits collection).

Finally, cycle through each Document returned in the Hits object, displaying the SKU and name of each product along with its relevancy score.

for (int i = 0; i < hits.length(); i++) {
    Document document = hits.doc(i);
    float score = hits.score(i);
    System.out.println(document.get("sku") + " :: " +
        document.get("name") + " :: " + score);
}

Advanced Queries
Up until now, the queries have been relatively simple ones such as "cat food" and "+cat +food". QueryParser has a powerful selection of query operators to facilitate more complex searches. Table 2 lists all of QueryParser's operators.

Wildcard queries are fairly straightforward. The "*" operator matches zero or more characters, while the "?" operator matches exactly one character. For example, "ca*" will match "cat", "car", "cap", or "candle", while "ca?" will match "cat", "car", and "cap", but not "candle". This is consistent with the behavior of "*" and "?" on a DOS or Unix command line.

The tilde (~) character, when used alone, performs a fuzzy search, matching words that are spelled similarly. For example, "cat~" will match "cat", but it will also match "car" and "rat" because these words are similarly spelled.

Surrounding two or more words with quotes (" ") produces a phrase. When two or more words are part of a phrase, those words must appear together to be considered a match. For example, the query "dog food" (quotes included) will match documents where "dog" is immediately followed by "food".

If a tilde and a number follow a phrase, a proximity search is performed. For example, "dog food"~10 will produce results where "dog" and "food" are found within 10 words of each other, but not necessarily adjacent to each other.

The caret (^) is a term booster: any word followed by a caret is given higher relevance than words not followed by one. For example, "dog^ kennel" will match documents containing "dog" or "kennel", but will give higher relevance to documents containing "dog".

The Boolean operators AND, OR, and NOT behave as you would expect. For example, "(cat AND food) OR bird" returns all documents containing both "cat" and "food", along with all documents that contain "bird". "cat NOT food" returns all documents containing "cat" but not "food". As you saw in the simple "cat food" example, OR is the default conjunction operator.

As shown in the previous example, parentheses can be used to group terms into subqueries.

As discussed, the plus sign (+) requires that a word or phrase exist in a field. Conversely, the minus sign (-) prohibits a word from appearing in the results and is roughly equivalent to NOT. For example, "dog -food" returns all documents containing "dog" but not containing "food".

Finally, there are times when you may want to search multiple fields. When constructing a QueryParser, you must specify a default field to be searched upon. Unless you specify otherwise, any words in your query will be looked for in the default field. In the examples, "keywords" is the default field. You can search on nondefault fields (assuming that they're indexed) by using a colon (:). For example, had the name field been tokenized and indexed, the query string "+cat +name:nummies" would return all documents in which the keywords field contains "cat" and the name field contains "nummies".
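
To make these operators concrete, here's a small, hedged example that simply parses a few of the query strings discussed above, using the same "keywords" default field and StopAnalyzer as before:

import org.apache.lucene.analysis.StopAnalyzer;
import org.apache.lucene.queryParser.ParseException;
import org.apache.lucene.queryParser.QueryParser;
import org.apache.lucene.search.Query;

// Parses a few of the query strings discussed above and prints the
// resulting Query objects.
public class QuerySyntaxDemo {
    public static void main(String[] args) throws ParseException {
        QueryParser parser = new QueryParser("keywords", new StopAnalyzer());

        // Require "cat", prohibit "food"
        Query required = parser.parse("+cat -food");

        // Proximity search: "dog" and "food" within 10 words of each other
        Query proximity = parser.parse("\"dog food\"~10");

        // Grouping with Boolean operators
        Query grouped = parser.parse("(cat AND food) OR bird");

        // Non-default field (assumes a tokenized, indexed "name" field)
        Query fielded = parser.parse("+cat +name:nummies");

        System.out.println(required + "\n" + proximity + "\n"
                + grouped + "\n" + fielded);
    }
}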

Customizing Lucene
While Lucene comes with an impressive set of functionality, you may still find that you want it to do something more or different than is available out of the box. As a search framework, Lucene provides several hooks for you to extend and/or modify its behavior.

In the previous examples, the analyzer chosen was StopAnalyzer. Underneath the covers, StopAnalyzer uses LetterTokenizer to tokenize text into individual words. LetterTokenizer treats any nonalphabetic character as a delimiter. This is fine in most cases, but what if you want to tokenize text that contains numeric characters ("0" - "9") as well as alphabetic characters? This would be desirable if the keyword text contains part numbers or model numbers. LetterTokenizer wouldn't help in this case.

Listing 6 defines AlphanumericTokenizer, a tokenizer that works like LetterTokenizer except for one small difference: it treats numeric characters as token characters along with alphabetic characters. It does this by subclassing LetterTokenizer and overriding the isTokenChar() method to return the results of LetterTokenizer's isTokenChar() implementation OR'd with a call to Character.isDigit().

AlphanumStopAnalyzer (see Listing 7) is an analyzer that uses AlphanumericTokenizer. The stop-word behavior of StopAnalyzer is still desired, so AlphanumericTokenizer is wrapped with a StopFilter. To normalize the text to lowercase, the stream is also wrapped with LowerCaseFilter. AlphanumStopAnalyzer is functionally equivalent to StopAnalyzer except that, since it uses AlphanumericTokenizer, it does not treat numeric characters as delimiters. To try out AlphanumStopAnalyzer, use it in place of StopAnalyzer in both ProductIndexer and ProductSearcher. Be sure to reindex with ProductIndexer before searching the index with the new analyzer.
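
Listings 6 and 7 aren't reproduced here, but a minimal sketch of the two classes, as described above, might look like this. The stop-word list is abbreviated for illustration, and the filter order in this sketch lowercases before stop-filtering so the stop words match regardless of case.

import java.io.Reader;

import org.apache.lucene.analysis.Analyzer;
import org.apache.lucene.analysis.LetterTokenizer;
import org.apache.lucene.analysis.LowerCaseFilter;
import org.apache.lucene.analysis.StopFilter;
import org.apache.lucene.analysis.TokenStream;

// Sketch of a tokenizer that accepts digits as well as letters.
class AlphanumericTokenizer extends LetterTokenizer {
    public AlphanumericTokenizer(Reader in) {
        super(in);
    }

    protected boolean isTokenChar(char c) {
        // Letters behave as in LetterTokenizer; digits are also kept
        return super.isTokenChar(c) || Character.isDigit(c);
    }
}

// Sketch of an analyzer that combines the tokenizer with lowercasing
// and stop-word removal. The stop-word list here is abbreviated.
public class AlphanumStopAnalyzer extends Analyzer {
    private static final String[] STOP_WORDS = { "a", "an", "and", "or", "the" };

    public TokenStream tokenStream(String fieldName, Reader reader) {
        TokenStream stream = new AlphanumericTokenizer(reader);
        stream = new LowerCaseFilter(stream);        // normalize case first
        stream = new StopFilter(stream, STOP_WORDS); // then drop stop words
        return stream;
    }
}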

Suppose that synonym-matching capability is required so that "cat" will match "kitten", "kitty", or "feline". AliasFilter (see Listing 8) is a subclass of TokenFilter that does this. AliasFilter retrieves its synonym list from entries in AliasFilter.properties. For example:

cat=feline kitten kitty
dog=canine puppy mutt
food=feed chow
parrot=bird

With each invocation of next(), AliasFilter first checks to see if there are any synonyms in the alias stack. If there are, it pops the next alias off the stack and returns it. Otherwise, AliasFilter retrieves the next token from the input TokenStream, adds any aliases that may exist to the alias stack, and then returns the next token.
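
Listing 8 isn't reproduced here; the following sketch shows the general shape of such a filter. It's written against the TokenFilter and Token constructors of later Lucene 1.x releases, and the properties loading is simplified, so the actual listing may differ in detail.

import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;
import java.util.Stack;
import java.util.StringTokenizer;

import org.apache.lucene.analysis.Token;
import org.apache.lucene.analysis.TokenFilter;
import org.apache.lucene.analysis.TokenStream;

// Sketch of a synonym-injecting filter along the lines described above.
public class AliasFilter extends TokenFilter {
    private static final Properties ALIASES = loadAliases();
    private Stack aliasStack = new Stack();

    public AliasFilter(TokenStream input) {
        super(input);
    }

    public Token next() throws IOException {
        // Return any pending synonyms before consuming more input
        if (!aliasStack.isEmpty()) {
            return (Token) aliasStack.pop();
        }
        Token token = input.next();
        if (token == null) {
            return null;
        }
        // Push synonyms so they are emitted alongside the original token
        String aliases = ALIASES.getProperty(token.termText());
        if (aliases != null) {
            StringTokenizer st = new StringTokenizer(aliases);
            while (st.hasMoreTokens()) {
                aliasStack.push(new Token(st.nextToken(),
                        token.startOffset(), token.endOffset()));
            }
        }
        return token;
    }

    private static Properties loadAliases() {
        Properties props = new Properties();
        try {
            InputStream in = AliasFilter.class
                    .getResourceAsStream("AliasFilter.properties");
            if (in != null) {
                props.load(in);
            }
        } catch (IOException e) {
            // fall through with an empty alias list
        }
        return props;
    }
}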

AliasAnalyzer (see Listing 9) constructs a TokenStream that does everything the TokenStream from AlphanumStopAnalyzer does, but it also uses AliasFilter to add synonyms to the TokenStream. To try AliasAnalyzer, use it as your analyzer instead of StopAnalyzer in both ProductIndexer and ProductSearcher. Again, be sure to reindex before searching.

When trying AliasFilter you may discover some strange, albeit desirable, behavior. Search for "feline". Even though there are no aliases for feline, all cat-related products appear in the search results. Why? When you use AliasAnalyzer to search for "feline", the token stream does not expand beyond "feline". So why do "cat" products appear? The reason is, you also used AliasAnalyzer to index the products. When you indexed a product containing "cat", AliasAnalyzer expanded the token stream to include "kitten", "kitty", and "feline" in the index. When searching for "feline" it will be found in products whose token stream was expanded to include "feline". In effect, you get an automatic two-way aliasing between "cat" and "feline", even though it appears to be only one way in AliasFilter.properties.

Another common problem in searching is paging the results. A search query could return anywhere from zero results to a seemingly infinite number of result documents. Good usability practices suggest that you page the results, showing the user only a handful at a time. This can be accomplished in Lucene using result filters.

To create a result filter, you must subclass org.apache.lucene.search.Filter. The only required method is the bits() method. It will return a java.util.BitSet where each bit represents a document in the result set. If the bit is true, the document will be returned in Hits, otherwise it won't be returned.

PageFilter (see Listing 10) is an example of a Filter that's used to paginate search results. Given a page number and a page size, PageFilter will pare down Lucene's result set to a specific page's subset of documents. It does this by creating a BitSet big enough to hold the maximum number of result bits and then looping through the bits that need to be turned on. To use PageFilter, change ProductSearcher's call to search() to look like this:

Hits hits = searcher.search(query, new PageFilter(1, 20));

This new call to search() will result in showing only the second set of 20 results.
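
Listing 10 isn't reproduced here, but a minimal sketch of such a page filter might look like the following. The zero-based page numbering is an assumption that matches the behavior described above (page 1 being the second set of 20 results); note also that a Filter selects by document number, so this pages through documents in index order rather than by relevance rank.

import java.io.IOException;
import java.util.BitSet;

import org.apache.lucene.index.IndexReader;
import org.apache.lucene.search.Filter;

// Sketch of a paging filter: only documents falling on the requested page
// are allowed through to the Hits collection. Page numbers are zero-based
// here, so new PageFilter(1, 20) admits documents 20 through 39.
public class PageFilter extends Filter {
    private int page;
    private int pageSize;

    public PageFilter(int page, int pageSize) {
        this.page = page;
        this.pageSize = pageSize;
    }

    public BitSet bits(IndexReader reader) throws IOException {
        BitSet bits = new BitSet(reader.maxDoc());
        int first = page * pageSize;
        int last = Math.min(first + pageSize, reader.maxDoc());
        for (int i = first; i < last; i++) {
            bits.set(i);
        }
        return bits;
    }
}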

Conclusion
Building a full-featured search engine can be a daunting task. But thanks to Lucene, many of the complicated details are abstracted behind an easy-to-use API. We've seen how easy it can be to create an index for searching practically any type of information. We've also seen how flexible Lucene is and how it can be extended to satisfy custom indexing and searching requirements.

Resources

  • Jakarta Lucene: http://jakarta.apache.org/lucene
  • NLucene, the .NET implementation of Lucene at SourceForge: http://sourceforge.net/projects/nlucene
  • JGuru FAQ on Lucene: www.jguru.com/faq/Lucene
  • About Lucene's creator, Doug Cutting: http://lucene.sourceforge.net/background.html

    SIDEBAR
    Index Components
    A Lucene index is a collection of documents organized in a way that allows quick retrieval of information when arbitrarily queried upon.

    Each document (implemented by org.apache.lucene.document.Document) in a Lucene index is made up of one or more fields that are name-value pairs, much like entries in a HashMap. A document can contain as much or as little information as is required to be searched upon. For example, a Lucene document could contain the complete contents of a Web page, text file, e-mail, etc. On the other hand, a Lucene document may contain only a minimal set of metadata, such as keywords, along with a URL, a product SKU, or some other identifying information used to reference a full information source stored outside of Lucene (such as in a file system or a relational database).

    Each field in a document can be defined as being any combination of stored, indexed, and tokenized. If a field is stored, its contents are fully retrievable upon a successful search. If a field is indexed, its content may be referenced in a query and searched upon. If a field is tokenized, its content is broken into one or more tokens (or words) prior to being indexed.

    Fields can be created using org.apache.lucene.document.Field. The Field class has several static factory methods that make short work of creating field entries. Table 3 illustrates these static methods and the types of Fields that they create.

    Why would you want to index a field, but not store it? Consider a field that contains keywords for your document: chances are you'll never display or perform any processing of this field, but you still want to be able to search upon it. By indexing it you're making the field searchable, but by not storing it, you're saving space because the text is not written verbatim to the index. On the other hand, you may want to store some data so that it can be retrieved later but not actually be able to search upon it. In that case, you'd choose a field that's stored but not indexed. When defining your fields, be mindful of what those fields will be used for, and for efficiency's sake choose an appropriate field definition.
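
    Table 3 isn't reproduced here, but the Field factory methods map onto these combinations roughly as follows; this is a small illustrative sketch against the Lucene 1.x Field API:

    import org.apache.lucene.document.Document;
    import org.apache.lucene.document.Field;

    // Illustrates the Field factory methods and the stored/indexed/tokenized
    // combinations they create (Lucene 1.x API).
    public class FieldDemo {
        public static void main(String[] args) {
            Document doc = new Document();

            // Keyword: stored and indexed, but not tokenized -- good for identifiers
            doc.add(Field.Keyword("sku", "AB-1234"));

            // UnIndexed: stored only -- retrievable, never searched upon
            doc.add(Field.UnIndexed("price", "19.95"));

            // UnStored: indexed and tokenized, but not stored -- searchable only
            doc.add(Field.UnStored("keywords", "cat food feline chow"));

            // Text: stored, indexed, and tokenized -- both searchable and retrievable
            doc.add(Field.Text("name", "Premium cat food"));

            System.out.println(doc);
        }
    }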

    SIDEBAR
    Search Components
    A Searcher (org.apache.lucene.search.Searcher) is used to access a Lucene index and query its contents. There are two subclasses of Searcher: IndexSearcher that searches a single index and MultiSearcher that searches one or more indexes and collects all the results in a single result set.

    Searches are performed by calling one of Searcher's search() methods and passing it a query (org.apache.lucene.search.Query). The search method returns an instance of org.apache.lucene.search.Hits. The Hits class is an array-like collection of documents that match your query. The documents in Hits are ordered by relevancy score.

    A Query object can be constructed using org.apache.lucene.queryParser.QueryParser. QueryParser's parse() method parses a query string that's written in its query language and builds an appropriate Query object for that query string. QueryParser also uses an Analyzer in performing the parsing of the query string. It's not required, but it is strongly recommended that you use the same Analyzer for parsing queries that you used when indexing your documents.

    SIDEBAR
    Text Analysis Components
    When a field is tokenized, its content is broken into one or more tokens or words. Facilitating this tokenization process is the notion of an analyzer (see Figure 1). An analyzer is any subclass of org.apache.lucene.analysis.Analyzer that defines the rules for tokenization.

    Analyzers rely on token streams (subclasses of org.apache.lucene.analysis.TokenStream) to define those rules. In fact, an analyzer is nothing more than a factory for creating instances of TokenStream.

    A token stream is an iterator that returns the next token with each call to its next() method, or null when there are no more tokens in the stream. Two important subclasses of TokenStream are Tokenizer and TokenFilter. Both of these classes are abstract and must be subclassed to define the specific rules on how to tokenize content.

    At the core of the tokenization process is a Tokenizer. A Tokenizer wraps an instance of java.io.Reader and performs the actual work of breaking a stream into individual tokens (not unlike the notion of a StringTokenizer).

    TokenFilters act as decorators of other TokenStreams. Token filters can be used to add, replace, or remove tokens from a TokenStream. For example, org.apache.lucene.analysis.PorterStemFilter is a TokenFilter that replaces each word in a TokenStream with its word stem (e.g., "painting" becomes "paint").

    To see how the text analysis components are used together, consider some of the TokenStream and Analyzer implementations packaged with Lucene. StopAnalyzer is an analyzer whose job is to remove stop words (e.g., "and", "or", "the", etc.) from a tokenized stream. At the core of StopAnalyzer is an instance of LowerCaseTokenizer. It tokenizes the stream into individual words, normalizing them to lowercase as it goes, where any nonalphabetic character is considered a delimiter. An instance of StopFilter decorates LowerCaseTokenizer, removing stop words from the stream as they're found. StopAnalyzer's tokenStream() method is merely a factory method that returns the decorator chain made up of LowerCaseTokenizer and StopFilter.
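
    To see a token stream in action, a short hedged sketch like the following can be used to print the tokens an analyzer produces for a piece of text (using the termText() accessor of the Lucene 1.x Token class):

    import java.io.IOException;
    import java.io.StringReader;

    import org.apache.lucene.analysis.Analyzer;
    import org.apache.lucene.analysis.StopAnalyzer;
    import org.apache.lucene.analysis.Token;
    import org.apache.lucene.analysis.TokenStream;

    // Iterates an analyzer's token stream and prints each token -- a handy
    // way to see exactly how a given Analyzer tokenizes a piece of text.
    public class AnalyzerDemo {
        public static void main(String[] args) throws IOException {
            Analyzer analyzer = new StopAnalyzer();
            TokenStream stream = analyzer.tokenStream("keywords",
                    new StringReader("The Cat and the Painter"));
            for (Token token = stream.next(); token != null; token = stream.next()) {
                System.out.println(token.termText());
            }
            stream.close();
        }
    }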

    About Craig Walls
    Craig Walls is the manager of Internet development for a Dallas, Texas-based retailer. He has eight years of experience in software development, six in Java. Craig is a Sun Certified Java programmer and a Sun Certified architect for the Java platform. He holds a BS in computer science from New Mexico State University.

