
Focus IT development on the user experience while improving the developer/designer relationship


Until recently, developing a user-friendly product rarely made the priority list of software makers. IT engineers were more concerned with the service to be rendered than they were with the usability[1] of their software. Packing a product with features was more important than making it easy to use. But at a time when IT products are a dime a dozen, we are now entitled to the same convenience and ease of use with business applications that we enjoy with mainstream products.

If software makers fail to consider usability, IT departments (their main customers) risk having their users turn to applications other than the ones they recommend. This leads to the growth of a parallel IT estate ("shadow IT") that is not certified by the IT department. The success of Tableau in the field of Business Intelligence is a perfect example.

 "User-Centered Design": Usability and Ownership

Software makers have now developed proven design methods to avoid this phenomenon and better identify user needs as well as potential constraints. User-centered design places the end user at the core of the development process (from needs analysis and the structure of the software to the development of more or less elaborate mockups). Nowadays, software developers can rely on user experience specialists to reach this goal.

Observe and Identify Key Use Cases

The first phase of the process is an on-site observation of users as they perform their tasks in everyday situations. This is the "field research" phase: studying how the user handles the application, how they interact with their coworkers and what constraints govern their daily work. For example, on an industrial assembly line, operators are often equipped with protective gloves, their keyboard is protected by plastic wrap and touch screens are more practical than using a mouse; it is essential that software makers consider this range of constraints to ensure that the design of new features is relevant.

Following this analysis, the UX designer composes descriptive profiles of each type of user ("personas"), detailing characteristics such as the user's command of computer applications. At the same time, with the help of the product manager, the designer completes a comprehensive analysis of use cases, i.e. the different needs that the software must meet. For example, to prepare a data set before integrating it into a marketing campaign management tool, a business user spends an average of 80% of their time preparing data and only 20% analyzing it. An application that can automate this process by suggesting changes (capitalizing the first letter of a name, splitting the first and last names into separate fields, correcting addresses, etc.) can help reverse this ratio, allowing the user to devote more time to adding value.

Prototyping and Testing

The second phase—prototyping—is an essential step; it provides an interface based on the analysis carried out in the initial phase. An initial set of static prototypes, often in black and white, is generated in the first round of analysis, followed by a second series of prototypes that are more in-depth and detailed. In the majority of R&D departments, the work of the UX designer stops there and the developer takes over to 'code' the interface. This phase of development, which can be quite long, requires a great deal of attention to detail to ensure that the designer's UX specifications are faithfully implemented. When they are not, bugs related to usability are identified but rarely made a priority, and fixing them is often put off until later. This frustrates all those involved in the project - the developer is criticized, the designer is not heard and the end user does not get what they want.

In my view, however, there is a simple solution that I have experienced myself - the UX designer must go further in the prototyping stage and deliver animated mockups in HTML and CSS. This should be done using modern front-end frameworks such as Bootstrap or AngularJS, which are now broadly used by developers. This greatly facilitates the work of developers, who can then concentrate on what they do best, i.e. data connection, data exchange, performance between front end and back end, etc. It also puts a functional, usable, and appealing version in front of decision makers and target users in the early stages, thus facilitating the adoption of the new software at all levels. Gone are the days of the long 'tunnel' phase, during which assessing the progress of the project is difficult; a 'live' version is now available at every stage of the project.

Moreover, it facilitates the completion of the last phase of the user-centered design process, which consists of testing and evaluating the developments. It becomes easy to run each use case against the prototype/product to verify that it works correctly at each stage. The target user works under actual operating conditions and says aloud what they are doing. Video cameras record the users, and once all footage is consolidated, problems are identified and addressed on a regular basis.

The Best of Both Worlds

Previously, the two professions of IT engineer and UX designer were very different. While they present different challenges due to the nature of their tasks, new forms of collaboration must be found, such as the joint development of a 'front-end' component. To this end, it would be useful if, in their respective courses of study, developers and designers were each familiarized with the work of the other, so that each party understands the general constraints of the project and, ultimately, so that they can speed up development while responding more precisely to the demands of end users.

Faced with increased competition, tighter turnaround times designed to enable quicker responses to market developments, and the ever faster delivery of applications consumed daily by our users, collaboration between UX designers and developers is becoming a strategic challenge for software makers.



[1] According to Wikipedia, usability corresponds to "the effectiveness, efficiency, and satisfaction with which specific users should be able to perform tasks".

 


Bootstrapping AWS CloudFormation Stacks with Puppet and Structured EC2 User Data


The purpose of this blog is to provide practical advice on how to approach application bootstrapping on AWS using CloudFormation, Puppet, Hiera and Serf. To obtain the most benefit from the blog, you should be technically familiar with these four technologies.

High level tasks include:

  1. Create your CloudFormation template
  2. Install and configure software on the instances
  3. Connect instances to other instances in the stack

Creating the CloudFormation template

Writing CloudFormation templates can be a painful task if you try to write them directly in JSON. Writing short and simple JSON structures by hand is fine; however, describing your entire AWS infrastructure directly as JSON is not very practical, for the following reasons:

- Developer workflow problems – keeping large JSON files in source control means that everyone must use exactly the same parser and output formatting; otherwise diffs are completely unusable, which prevents code review.

- Very limited code modularity and re-usability.

- Depending on your parser of choice, syntax errors can be hard to find, especially in large documents. 

- Code without comments is never a good idea, and JSON offers no comment syntax.

- Syntactically correct JSON isn't necessarily semantically correct CloudFormation. Using a full programming language allows for testing at an earlier stage.

Instead of writing large JSON files directly, it's much easier to use a full-featured programming language to generate the templates.

We have selected Python and Troposphere (https://github.com/cloudtools/troposphere) for our CloudFormation generation.
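To make that concrete, here is a minimal, hedged sketch of what template generation with troposphere looks like; the logical resource name and AMI id are illustrative placeholders, not values from a real stack:

# Minimal troposphere sketch: describe a resource in Python, render JSON.
# "WebInstance" and the AMI id are illustrative placeholders.
from troposphere import Tags, Template
from troposphere.ec2 import Instance

template = Template()
template.add_description("Example stack generated with troposphere")

template.add_resource(Instance(
    "WebInstance",
    ImageId="ami-12345678",   # hypothetical custom AMI
    InstanceType="t2.small",
    Tags=Tags(Name="webapp"),
))

# Because the template is generated by code, it can be unit-tested and
# code-reviewed before it ever reaches CloudFormation.
print(template.to_json())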

In addition to describing your AWS resources, CloudFormation has the task of providing the instances' bootstrapping logic. This is done in Instance UserData. 

Let's have a look at some of the possibilities for UserData:

  1. A single shell script as UserData - requires cloud-init and encoding the shell script in the JSON template
  2. A YAML-encoded cloud-init configuration - requires cloud-init and encoding YAML in the JSON template
  3. CloudFormation helper scripts - generally require Amazon Linux; may require encoding shell scripts in the CloudFormation metadata resource
  4. JSON-encoded UserData (our preferred option) - requires a custom AMI (Amazon Machine Image), since the logic that interprets the custom JSON UserData must already exist on the AMI

Exploring Option 4

This option requires using custom AMIs, but that's actually not a new requirement for us. We use custom AMIs because we don't want to install software during instance boot, which can cause failures and/or a general slowdown during autoscaling.

Since we are already building custom AMIs (using http://packer.io), why not install a start-up script that reads the structured UserData and passes the bootstrapping task to a configuration management tool? Configuration management tools are much better equipped for this task than shell scripts or cloud-init helper scripts.

Install and configure the application on the instance

Using custom AMIs means that installation happens during AMI creation, while configuration happens during instance boot.

Since we are taking the approach of JSON encoded UserData we need something on the instances that understands this UserData and translates it into application configuration.

Take, for example, the following UserData:

{
  "platform": {
    "branch": "releases",
    "dc": "aws-us-east-1",
    "environment": "development",
    "profile": "tipaas",
    "release": "101",
    "role": "webapp",
    "stack": "development-testing"
  },
  "cloudformation": {
    "resource_name": "WebAutoscalingGroup",
    "stack_name": "rnd-IntegrationCloud-1VTIEDLMDO8YW-Web1a-1IY3WUHI0XCNN"
  },
  "webapp_config": {
    "elastcache_endpoint": "dev1shared.qygjey.cfg.use1.cache.amazonaws.com:1121"
  }
}

Now, back to the Python troposphere library: it's very easy to extend troposphere with a custom UserData Python class that returns the above JSON when the final CloudFormation template is rendered.
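The helper itself is not shown in this post, so here is a hedged sketch of what it could look like; the name structured_userdata and its signature are illustrative, not the actual implementation:

# Hedged sketch of a structured-UserData helper for troposphere.
# The real implementation is not shown in this post; names are illustrative.
import json
from troposphere import Base64

def structured_userdata(platform, cloudformation=None, **extra_sections):
    """Render structured JSON UserData as a CloudFormation Base64 value."""
    payload = {"platform": platform}
    if cloudformation is not None:
        payload["cloudformation"] = cloudformation
    payload.update(extra_sections)   # e.g. webapp_config={...}
    return Base64(json.dumps(payload, indent=2, sort_keys=True))

An instance or launch-configuration resource would then take UserData=structured_userdata({...}) when the template is built.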

What is left to do is translate the above JSON into a concrete Puppet catalog (remember: Puppet, Hiera and the Puppet modules are already installed on the AMI).

Next steps are:

1) "facter-ize" all the platform variables

2) execute `puppet apply site.pp`, where site.pp is an empty manifest containing only an empty default node, and let Hiera provide all the classes and variables for the Puppet catalog compilation

For example, "/etc/rc.local" looks like this:

#!/bin/bash
/usr/local/sbin/ec2_userdata_facterize.py  # reads the EC2 UserData and creates a facter fact for each variable in "platform"
/usr/bin/puppet apply site.pp
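The post does not list ec2_userdata_facterize.py itself, so here is a minimal sketch of the idea, assuming the standard EC2 metadata endpoint and facter's conventional external-facts directory (neither path is confirmed by the post):

#!/usr/bin/env python
# Hedged sketch of ec2_userdata_facterize.py (the real script is not shown):
# fetch the JSON UserData from the EC2 metadata service and write one
# external fact per "platform" variable, prefixed with t_.
import json
import urllib.request

USERDATA_URL = "http://169.254.169.254/latest/user-data"
FACTS_FILE = "/etc/facter/facts.d/platform.txt"   # facter external facts

userdata = json.load(urllib.request.urlopen(USERDATA_URL))
with open(FACTS_FILE, "w") as f:
    for key, value in sorted(userdata.get("platform", {}).items()):
        f.write("t_%s=%s\n" % (key, value))   # e.g. t_role=webapp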

Here is a snippet of the hiera.yaml content (a very short excerpt of our actual hierarchy):

- "%{::t_profile}/role/%{::t_role}/dc/%{::t_dc}/env/%{::t_environment}/stack/%{::t_stack}"

- "%{::t_profile}/role/%{::t_role}/env/%{::t_environment}/stack/%{::t_stack}"

- "%{::t_profile}/role/%{::t_role}/env/%{::t_environment}/release/%{::t_release}"

- "%{::t_profile}/role/%{::t_role}/dc/%{::t_dc}/env/%{::t_environment}"

- "%{::t_profile}/role/%{::t_role}/env/%{::t_environment}"

- "%{::t_profile}/env/%{::t_environment}/stack/%{::t_stack}"

- "%{::t_profile}/env/%{::t_environment}/release/%{::t_release}"

- "%{::t_profile}/env/%{::t_environment}/variables"

Now we have successfully separated the CloudFormation development from the configuration development. We are also using the full potential of Hiera and Puppet to separate code from configuration variables.

But there's more: we use the platform variables/facts and Serf (http://serfdom.io) to connect instances to the other instances in the same stack.

Connecting to other instances in the stack

(Note: This approach is only suitable for development environments or testing phases of CI pipelines.)

Now that we have our facter facts in place, we use them to configure a Serf agent on each instance of the stack. Serf is an agent for decentralized cluster membership (for more details, see http://serfdom.io).

The agent is configured with a set of tags corresponding to the platform variables in our UserData. Once the Serf agent is configured and running, we can use it to obtain information about the other nodes in the stack.
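As a hedged sketch of that configuration step (Serf agents can read a JSON config file; the config path and the seed-node address below are illustrative assumptions, not values from our stacks), the start-up script could write something like:

# Hedged sketch: write the Serf agent config from the platform variables.
# The config path and seed node address are illustrative assumptions.
import json

serf_config = {
    "tags": {                      # one tag per platform variable/fact
        "t_branch": "releases",
        "t_dc": "aws-us-east-1",
        "t_environment": "development",
        "t_profile": "tipaas",
        "t_release": "101",
        "t_role": "webapp",
        "t_stack": "development-testing",
    },
    "start_join": ["10.100.0.10"],   # hypothetical seed node in the stack
}

with open("/etc/serf/agent.json", "w") as f:
    json.dump(serf_config, f, indent=2)

The agent would then be started with `serf agent -config-file=/etc/serf/agent.json`.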

Here is an example of output obtained by running serf:

# /usr/local/bin/serf members -status alive -format json
...
{
  "name": "ip-10-100-9-130",
  "addr": "10.100.9.130:7946",
  "port": 7946,
  "tags": {
    "t_branch": "releases",
    "t_dc": "aws-us-east-1",
    "t_environment": "development",
    "t_profile": "tipaas",
    "t_release": "101",
    "t_role": "webapp",
    "t_stack": "development-testing"
  }
}
...

The output of the above command contains one such member definition for each instance in the stack. Now we have to make this information available to Puppet in an easy way. That's done again with Hiera and facter.

First, we create a set of custom facts – one for each profile+role pair – where the remaining platform variables (all but profile and role) match the same set of variables on the node where the custom facts are generated.
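A hedged sketch of how those facts could be produced follows; the blog describes the result rather than the script, so the matching logic and file path here are illustrative:

#!/usr/bin/env python
# Hedged sketch: turn `serf members` output into serf_my_* external facts.
# The real script is not shown; details below are illustrative.
import json
import subprocess

# The local node's platform variables; in practice these come from the
# t_* facts created at boot.
LOCAL = {"t_environment": "development", "t_stack": "development-testing"}

out = subprocess.check_output(
    ["/usr/local/bin/serf", "members", "-status", "alive", "-format", "json"])

with open("/etc/facter/facts.d/serf_my.txt", "w") as f:
    for member in json.loads(out)["members"]:
        tags = member["tags"]
        # Keep only nodes whose remaining platform variables match ours
        if any(tags.get(k) != v for k, v in LOCAL.items()):
            continue
        # One fact per profile+role pair, e.g. serf_my_tipaas_webapp
        f.write("serf_my_%s_%s=%s\n"
                % (tags["t_profile"], tags["t_role"], member["name"]))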

Example:

# facter | grep serf_my_
serf_my_activemq_broker => ip-10-100-2-21
serf_my_activemq_re => ip-10-100-49-114
serf_my_elk_elasticsearch => ip-10-100-41-79
serf_my_idm_syncope => ip-10-100-9-130
serf_my_mongo_repl_set_instance => ip-10-100-62-139
serf_my_repomgr_nexus => ip-10-100-51-250
serf_my_postgres_db => ip-10-100-20-245
serf_my_tipaas_rt_flow => ip-10-100-105-201
serf_my_tipaas_rt_infra => ip-10-100-36-145
serf_my_tipaas_webapp => ip-10-100-47-174

Now that we have these custom facts, we can introduce them at the appropriate levels of the Hiera hierarchy.

Example from a Hiera file, tipaas/role/webapp/env/development.yaml:

---
tipaas::activemq_nodes: "%{::serf_my_activemq_broker}"
tipaas::mongo_nodes: "%{::serf_my_mongo_repl_set_instance}"

A Few Conclusions

Encoding shell scripts in CloudFormation templates is a valid approach, but using structured UserData provides better separation of concerns between the infrastructure code and configuration management.

Using Troposphere and Python to develop CloudFormation templates allows for common developer workflows such as code reviews, local testing and inline documentation as part of the code.

Combining master-less Puppet and Hiera with Serf (serfdom.io) works really well for orchestrating development and integration environments.

 

Being a Data-Driven Retailer: What’s in it for You?


The short answer to the above question is: 'a lot.' However, that hardly does justice to the vital importance of adopting a data-driven approach to managing retail business activities, ranging from supply chain management to pricing and demand planning. This post therefore provides a detailed overview of the performance benefits retailers enjoy by laying the foundation to become truly data-driven.

Before we highlight all the reasons why you must become data-driven, let's first define the term. Data-driven retailers are organizations that exhibit a high level of prowess in their data management activities, helping them convert multi-channel customer data into actionable insights. It's these insights that allow them to deliver seamless (consistent and personalized) messages across multiple channels, such as in-store, web, live chat, email and mobile applications.

Below is a chart from Aberdeen Group’s Data-Driven Retail study illustrating the wide performance gaps between savvy retailers with Best-in-Class data management practices and All Others — more immature retail organizations.

Figure 1: Effective Use of Data Powers Superior Results

Retailers that excel in converting data into insights reap the rewards of their efforts across three main categories:

Operational efficiency: This refers to an organization's ability to streamline its business activities to achieve desired outcomes with minimal errors, cost and effort. Performance metrics such as time-to-market of products and services, as well as average order delivery time, are among the indicators of success in operational excellence. Data-driven retailers enjoy significantly greater annual improvements (i.e., reductions) in the time it takes to bring products and services to market, compared to All Others. Improvements in this metric validate an organization's ability to use data to understand the bottlenecks and restrictions in providing customers with products and services, and to use these insights to fine-tune existing activities.

Financial success: Operational excellence should be a core focus of all retailers, but it must be balanced by the pursuit of financial health, as the ultimate success of a firm is often measured by its top-line and bottom-line results. Effective use of data helps retailers enjoy more than twice the annual increase in return on marketing investments (ROMI), compared to companies without Best-in-Class data management activities. Using a data-driven approach to establish the right prices for the right products, and to ensure item availability, also helps savvy retailers achieve far superior gains in product margins and return on inventory investments — both key measures tracked closely to ensure financial success.

Customer experiences: It's important to note that the aforementioned results are closely linked with an organization's ability to identify shopper needs and address them in a timely and relevant fashion. Companies that make effective use of data once again excel in this area, enjoying approximately three times greater annual increases in brand awareness, compared to All Others. This validates the benefits of utilizing data to identify unique consumer groups and target them with personalized messaging aimed at making them aware of the company's brand and encouraging them to purchase its products and services.

Omer Minkara

Research Director, Contact Center & Customer Experience Management

Aberdeen Group

Like to learn more? Read Aberdeen's Data-Driven Retail study for a deeper look at the challenges faced by modern retailers and how effective data management activities pave the way for success.

 

About the author, Omer Minkara

Omer Minkara is the Research Director leading the Contact Center & Customer Experience Management research within Aberdeen Group.

In his research, Omer covers the Best-in-Class practices and emerging trends in the technologies and business processes used to enhance customer experience across multiple interaction channels (e.g. social, mobile, web, email and call center). Omer’s research is widely consumed by senior-level Customer Care, Marketing, Sales and Service executives. He has published numerous industry research papers, which are used by executives worldwide to build and nurture strategic customer engagement programs. Omer also speaks frequently with global decision makers to discuss their customer management activities.

Omer has a strong finance background with significant international experience. Prior to joining Aberdeen Group, he was an auditor at PricewaterhouseCoopers in the Europe region. Omer has an MBA degree from Babson College, where he participated in the launch of a technology company, creating a customer acquisition and engagement strategy, and developing all the operational and financial forecasts for the enterprise.

The Path to Optimize Retail Operations through Big Data


Running a retail business is no easy feat. There are plenty of unknowns and moving parts that change the dynamics of the business. Coupled with rising consumer expectations for better service, achieving operational excellence is not an option, but rather a necessity for modern retailers.

In our previous post we noted the benefits of becoming a data-driven retailer. We also highlighted that savvy organizations that excel in making effective use of data enjoy substantial improvements in key operational metrics such as average order delivery time and time to market of products / services.

Findings in Aberdeen Group’s Data-Driven Retail study revealed invaluable insights on achieving operational excellence. The common factor across all these insights is that retailers must streamline how they capture, manage and use operational and consumer insights in order to maximize their performance. The below figure illustrates several activities that help savvy retailers accomplish this goal.

Figure 1: Drill-down into Data to Tailor the Shopper Experience

A common mistake many retailers make is focusing solely on adopting the activities above while overlooking the vital role data governance plays in ensuring these activities use relevant and timely insights. For example, the above figure shows that retailers must utilize data from different shopper segments to identify the optimal price for each product — as well as the channels used to sell these items. In the absence of relevant and accurate data, however, pricing and customer-targeting activities are prone to error, resulting in outcomes such as setting product prices lower than their value in the eyes of shoppers, and thus missing out on additional revenue.

Retailers have access to a world of insights across multiple channels. Some of these are structured (e.g. web logs) and others are unstructured (e.g. customer-generated social media content such as tweets). Successful businesses differentiate themselves through their ability to integrate these different types of data into a single view of the shopper. Doing so provides visibility into the products consumers demand, identifies price sensitivity for different items, and determines which e-commerce site content is most likely to convert website visitors into paying clients.

If you haven’t already laid the foundation to help your employees analyze more data gathered from more sources, we highly recommend you do so. It will require moving to a new big data processing paradigm so you can integrate all your data, operate in real-time and act with insight. This will help you realize the full potential of using the best practices highlighted in the Data-Driven Retail study.

Omer Minkara

Research Director, Contact Center & Customer Experience Management

Aberdeen Group

As noted in the above post by Aberdeen, Best-in-Class retailers have implemented big data technologies that correlate historical patterns with in-the-moment buying behavior. These sites can predict when a shopping cart is about to be abandoned and take instant action to prevent it, such as offering new information, reviews or a price discount. Product recommendation engines, such as those provided by Amazon, are excellent examples of increasing customer loyalty as well as average purchase value.

Like to learn more? Read Aberdeen's Data-Driven Retail study for a deeper look at the challenges faced by modern retailers and how effective data management activities pave the way for success.

About the author, Omer Minkara

Omer Minkara is the Research Director leading the Contact Center & Customer Experience Management research within Aberdeen Group.

In his research, Omer covers the Best-in-Class practices and emerging trends in the technologies and business processes used to enhance customer experience across multiple interaction channels (e.g. social, mobile, web, email and call center). Omer’s research is widely consumed by senior-level Customer Care, Marketing, Sales and Service executives. He has published numerous industry research papers, which are used by executives worldwide to build and nurture strategic customer engagement programs. Omer also speaks frequently with global decision makers to discuss their customer management activities.

Omer has a strong finance background with significant international experience. Prior to joining Aberdeen Group, he was an auditor at PricewaterhouseCoopers in the Europe region. Omer has an MBA degree from Babson College, where he participated in the launch of a technology company, creating a customer acquisition and engagement strategy, and developing all the operational and financial forecasts for the enterprise.

 

The Role of Data Governance in Delivering Seamless Omni-Channel Experiences


Deploying the most powerful technology tools and building sophisticated processes help retailers maximize the shopper experience, drive operational excellence and ensure financial health. However, these results all depend on the quality of the data used to drive business activities. Indeed, findings from Aberdeen Group's Data-Driven Retail study show that data quality is the most pressing challenge influencing retailers' data management activities — see the below figure.

Figure 1: Balance Quantity with Quality

Note: This question was asked as a multi-answer question, meaning that respondents were able to select multiple data management challenges impacting their activities.

Quality of data refers to the cleansing and profiling of the data captured across multiple channels (e.g. social media, web and mobile applications). Retailers capture myriad insights related to shopper activities through numerous channels; some of this data is structured and some is unstructured. Unless the data is cleansed and profiled, companies will struggle to establish visibility into consumer journeys by integrating the data captured across these channels.

Savvy retailers are more than twice as likely as their peers to use technology tools such as database management to cleanse and profile customer and operational data. This helps them ensure that the data used to run analytics yields relevant results to guide shopper interactions as well as manage activities such as inventory management and pricing.

If you don't currently have a data governance program designed to streamline your data management activities, we highly recommend adopting one. This will help you get the maximum benefit from your technology investments in areas such as omni-channel and analytics. It will also help you minimize the business risks associated with using poor insights to guide your strategic and tactical activities.

Omer Minkara

Research Director, Contact Center & Customer Experience Management

Aberdeen Group

There are myriad technology tools that help retailers maximize shopping experiences. See Talend's related post highlighting these technology enablers. It's important to note that the results from the use of technology are only as good as the data fed into these systems. Talend and Aberdeen highly recommend that companies cleanse and profile customer data captured across multiple channels in order to improve data quality and, as a result, the accuracy of the insights gleaned through the use of analytics as a retailer. Watch the Talend webinar on ensuring data quality to learn more on this topic.

To learn more on how Best-in-Class retailers build and manage data governance programs, read Aberdeen’s Data-Driven Retail study.

About the author, Omer Minkara

Omer Minkara is the Research Director leading the Contact Center & Customer Experience Management research within Aberdeen Group.

In his research, Omer covers the Best-in-Class practices and emerging trends in the technologies and business processes used to enhance customer experience across multiple interaction channels (e.g. social, mobile, web, email and call center). Omer’s research is widely consumed by senior-level Customer Care, Marketing, Sales and Service executives. He has published numerous industry research papers, which are used by executives worldwide to build and nurture strategic customer engagement programs. Omer also speaks frequently with global decision makers to discuss their customer management activities.

Omer has a strong finance background with significant international experience. Prior to joining Aberdeen Group, he was an auditor at PricewaterhouseCoopers in the Europe region. Omer has an MBA degree from Babson College, where he participated in the launch of a technology company, creating a customer acquisition and engagement strategy, and developing all the operational and financial forecasts for the enterprise.

Survive and Thrive in a Data-Driven Future: Talend Hits the Big Apple at Strata and Hadoop World 2015!


Big data analytics has been advancing in recent years as the amount of information has exploded to petabytes and even exabytes of data. Hadoop is definitely the platform of choice for big data analysis and computation; however, as data Volume, Variety and Velocity increase, Hadoop as a batch processing framework cannot cope with the requirement for real-time analytics.

Enter Spark: the engine for large-scale data processing written in Scala. In contrast to Hadoop's two-stage, disk-based MapReduce paradigm, Spark's multi-stage in-memory primitives provide performance up to 100 times faster for certain applications. As a leading provider of open source data integration solutions, Talend is on the bleeding edge of what's next for the future of big data analytics, and we're excited to share that with you!

Strata + Hadoop World brings together the best minds in strategy, science, and industry for the defining event of the big data community to explore data-driven use cases, IoT and real-time, production-ready Hadoop, Hadoop use cases and more! Talend executives, solutions experts and partners plan to take Strata by storm, joining industry colleagues in dissecting case studies, delivering in-depth tutorials, sharing emerging best practices, and building the future of your business. Join us at the Jacob Javits Convention Center to see how Talend is 'awakening your data' to the benefit and delight of all your customers!

Connecting the Data-Driven Enterprise

I don't think there is a more capable speaker on "what's next" for Hadoop and big data technology than Talend's Chief Marketing Officer, Ashley Stirrup. In his solutions theater presentation, "Using Spark: 5 Quick and Easy Ways to Get More out of your Data," Ashley will highlight how a faster data integration solution can help you fundamentally transform the customer experience and increase revenue.

Using Spark: 5 Quick and Easy Ways to Get More out of your Data

Speaker: Ashley Stirrup, Chief Marketing Officer, Talend

Time: Wednesday, Sept. 30th at 11:30 AM EST (repeat session on Thursday at 1:45 PM EST)

Location: Solutions Showcase Theater

We also have a Spark demo that will be performed alongside our partners at Cloudera in booth #725 on Wednesday, Sept. 30 at 4:20 PM EST.

Take Advantage of Big Data Jedi Coaching and Enter our Star Wars Collectible Contest at Booth #846

Learn how Talend makes data processing and analysis light-years faster with Spark Real-Time Data Analysis. Talk to one of our Big Data Jedis for tips on how to overcome some of the most common big data challenges and explore various Spark use cases. Simply make an appointment here or drop by booth #846. And while you're there, be sure to snap an in-booth selfie and post it to Twitter or Instagram using the #Talend6Awakens and #strataconf hashtags, and be entered to win a Star Wars collectible poster!

If you can't make it to the Javits Convention Center in New York City on September 30th – October 1st, you can still enter for a chance to win Star Wars memorabilia: show us your creative force by producing and posting a 15-30 second video or a custom meme explaining how the Rebels or the Empire could have used real-time data analysis. Submit your entry with the #Talend6Awakens hashtag through Twitter, Facebook or Instagram.

You can also follow all the show action through our social accounts @Talend, on our Talend for Big Data LinkedIn group, or on Facebook at facebook.com/talend. We'll also have key news and insights from Strata right here on the blog, so stay tuned….

Hope to see you in booth #846!

Real-Time Big Data About to Go Mainstream – Are You Ready?


Real-time big data analytics, already exhibiting rapid growth, is reaching an inflection point.  Now, for the first time, all the ingredients are coming together to form an integrated real-time big data supply chain that will transform how we do business.

This development couldn’t come at a more auspicious time.  According to IDC, we are caught up in a digital universe that is growing 40% a year, fueled not only by the online presence of people and enterprises, but also the rapid growth of the Internet of Things (IoT).  This digital universe is doubling in size every two years. By 2020, says IDC, it will reach 44 zettabytes – that’s 44 trillion gigabytes.

Until now, the ability to mine this deluge of data for answers that could be acted upon immediately was the domain of a handful of the world's most sophisticated companies – massive organizations with really huge IT budgets and teams. Because of technical and financial constraints, small to medium-sized enterprises have been largely sitting on the sidelines when it comes to applying real-time big data analytics.

Game Changer

With the introduction of Talend 6, the real-time analytics landscape has changed forever. Talend 6 is the industry's first cost-effective data integration platform with native support not only for Hadoop, but for Apache Spark and Spark Streaming as well. By leveraging over 100 Spark components, the platform delivers the unmatched data processing speeds needed to convert streaming big data or IoT sensor information into actionable insights in real time.

It’s powerful to know what your customers were doing last week, but it’s even more powerful to track their behavior as it happens and be able to respond immediately to transform your customer’s experience for the better.  There’s a Zen-like phrase that’s making the rounds that somewhat sums this up: “If it’s not in the moment, it’s meaningless.”

A real-time big data supply chain allows you to introduce innovations into your customer-facing solutions that were unimaginable before – a direct result of the significant performance gains that the new Talend 6 platform makes possible.

For example, existing Talend customers can convert MapReduce jobs (the old way of doing things in Hadoop) to Spark at the click of a button, resulting in an immediate 5x performance increase. Developer productivity is up 10x compared to hand coding, thanks to an intuitive design interface and prebuilt Spark components with automated Spark code generation. Talend 6 also provides a built-in Lambda architecture that offers a single environment for working with bulk and batch, real-time, streaming and IoT data.

More important than the technical specifications of Talend 6, however, is what the platform powers for companies: the innumerable use cases that are created when you can ask your data anything and receive an answer in an instant.

A Few Use Cases Highlighting the Real-Time Big Data Difference

Here are just a few examples of the power of real-time big data made possible by the Talend platform:

- Healthcare - Medical alert pendants, some with motion detectors, allow the elderly to connect directly with a dispatcher should they become incapacitated and unable to otherwise call for help.

Now, powered by real-time big data, a health service provider is able to constantly monitor at-risk patients. By combining real-time personal device data tracking vitals with medical record information, analytics tools can alert healthcare professionals if proactive patient action is required.

- Retail - Shopping cart abandonment — when shoppers put merchandise in an online cart, but leave before completing the purchase — is a significant challenge for retailers. According to BI Intelligence, a staggering $4 trillion worth of merchandise will be abandoned in shopping carts this year. Without real-time big data analytic capabilities, the only thing retailers are able to track is the extent of the loss.

Spark-powered data integration, coupled with Spark-enabled analytics, gives organizations the speed and agility needed to begin to tackle the issue of shopping cart abandonment. With the ability to process real-time big data, companies can not only predict shopper behavior but also automatically deliver incentives to ensure shoppers complete their purchases.

- Agriculture - Historically, farmers would submit a physical soil sample to a service and weeks later receive an analysis telling them what actions to take to maximize a harvest.

Talend 6 support for Spark data integration and analytics allows services to correlate multiple sources of structured and unstructured data – data from the field combined with historical lab data – to deliver analysis and reporting within seconds. This allows farmers to make informed moment-by-moment management decisions.

Democratization of Real-Time Analytics

With this new release, Talend is providing the first real-time big data integration platform with the potential to totally transform how organizations of all sizes do business.  Real-time analytics capabilities are no longer just for the deep-pocketed few. 

It also means that you may be at a competitive disadvantage if you don't embrace real-time big data analytics using an integrated, cost-effective platform like Talend 6.

But the impact of this shift to real-time analytics has even further-reaching implications. Beyond the obvious advantages inherent in real-time marketing, other parts of the business – from manufacturing and supply chain management to human resources – can benefit as well. This is an excellent opportunity for IT to collaborate with other parts of the business to make the most of the real-time big data supply chain and explore innovative new ways to use this advanced technology.

A new era has begun.


You Can’t Fake the Data-Driven Force


Before I get into my blog – an admission – I’m a major Star Wars fan.

No, seriously – a MASSIVE fan. Back in the day, I was a light-saber-toting, wookie-lovin', Princess-Leia-poster-on-the-wall type fan. Yup, that guy. In three months, the next Star Wars movie hits the streets and, as you can imagine, I can't wait. While I still love the more recent episodes (one through three) of the franchise, my heart really belongs to the original three (episodes four, five, and six), which came out while I was still a young(ish) boy.

The thing is, Vader had insight. He knew that he was Luke’s father… but Luke didn’t know. I remember watching the movie in the 80s and thinking “how does he not know? What is the point of the Force, if he can’t figure out some basic stuff? Come on!”

When it comes to being “data-driven”, I get the same feeling. “Come on! Surely you can do better than that!” I mean, what’s the point of all that data if you lack insight? While there are plenty of companies that are beginning to use their data, there is still only a handful that can fully exploit it. Those companies appear to know more about their customers than the customers know about themselves. They have insight. It’s instant, relevant, personal, possibly unnerving, but in the end, it’s about providing exceptional service.

You know the few companies that can do this today. It’s the likes of Google, Amazon and Netflix... They know what you are searching for before you finish typing, they automatically identify TV gems you would have never otherwise discovered, and they have made the process of ordering and receiving goods as easy as a single click of the mouse. In short, they deliver experiences that are instant, relevant, personal, and delightful.

Of course, they also have one other thing going for them – they could fill a football field with their IT talent and funding.

Clearly using day or week-old data to make business decisions and shape the customer experience is no longer going to cut it. Customers expect you to have a much better understanding of their needs and deliver a far more personalized experience. This however poses a wee bit of a challenge for the majority of companies that likely couldn’t fill a large office with their IT team and budget, let alone a football field.

Or at least it was a challenge until now - cue dramatic Star Wars opening theme music…

Talend 6 was introduced yesterday and became the FIRST integration product to be built on Apache Spark. This is really significant, as it allows any company, regardless of the size of its IT budget or team, to handle real-time big data. This means you too can turn huge volumes of data into immediately actionable insights.

Cool right? It’s almost like we’ve handed you the Force. All we ask is that you use it wisely. And, unlike poor Luke, apply it to the really important stuff like revenue and your customer relationships.

P.S. Shameless, I know, but share my post for your chance to win a cool Star Wars-inspired tee that we created in celebration of getting Talend 6 out the door!


Unlocking the Power of the Cloud: Talend Teams Up with AWS at re:Invent 2015


Hot on the heels of Strata + Hadoop World NYC, next week Talend ships out to ‘Sin City’ for four jam-packed days at AWS re:Invent, “the largest gathering of the global Amazon Web Services community.”

We’re still basking in the glow of our new Talend 6 real-time big data integration platform, which is going to make a huge impact on big data cloud environments everywhere.

At re:Invent, our primary demo will team AWS solutions up with Talend Integration Cloud, which lets you connect all your data in the cloud and on the ground. The solution includes over 900 connectors and components to simplify the development of cloud-to-cloud and hybrid integration flows that can be deployed as governed integration services. At the show, you will be able to learn first-hand how to build simple or complex integration flows inside Talend Studio that connect, cleanse, and transform data. You might need to see it to believe it, but with a simple push of a button, you can publish and go live in seconds!

Talend Integration Cloud for AWS also offers:

- The Best for Real-Time Big Data, Spark, Kinesis & Kafka

- Connections for all your data sources & applications, cloud and on-premises

- Business user self-service features to trigger agile innovation across the company

- Integration with Aurora, EMR, S3, RDS, and Redshift

- Real-time Big Data — Insight at a fraction of the cost with support for leading big data distributions and data sources

Awaken your Cloud Data and Win

Make an appointment here or drop by booth #630 during AWS re:Invent at the Sands Expo convention center at The Venetian in Las Vegas, October 6-9.

While you’re there, be sure to snap an in-booth selfie and post it to Twitter or Instagram using the #Talend6Awakens and #reInvent hashtags to be entered to win a Star Wars collectible poster!

Win a David Prowse Autographed Darth Vader 8 x 10 Poster! 

Not attending? Good news! You can still enter for a chance to win Star Wars memorabilia: show us your creative force by producing and posting a 15-30 second video or a custom meme explaining how the Rebels or the Empire could have used real-time data analysis.

Be sure to follow all the show action through our social accounts, on Twitter and LinkedIn. We’ll also have key news and insights from the show right here on the blog, so stay tuned….

Why Driving a Data-Driven Culture is Essential to Business Success


Big data is a familiar term to everyone in the world of IT, but it is now becoming a fixture of our everyday lives as well. Big data is forecast to keep shaping all aspects of our lives, especially at work.

For better or for worse, 90% of the world’s data was generated within the last two years. But that data is only useful when anyone and everyone who needs it can access and understand it. This is why traditional business intelligence tools are being superseded by easy-access software solutions that don’t require initiation into some high priesthood of data science, or a PhD in statistics.

The rising tide of data has necessitated that everyone have familiarity with data analysis, not just experts with “analyst” in their titles. Organisations that make better use of data to make decisions are more successful, while those that don’t will begin to fall behind.

The democratisation of data has emerged as a consequence of several trends – proliferation of devices and the consumerisation of IT in general – and signs point to it becoming an ever more prevalent trend.  We are moving to the pervasive use of data, through online and real-world tracking and the internet of things.

In a world where people are drowning in data – from information on the Web, on spreadsheets, and in databases on tablets and devices – people need a lifeline, and that lifeline is data analytics.

A recent Teradata survey found that about 90% of organizations report medium to high levels of investment in data analytics solutions, and about a third call their investment “very significant.”

The study underlines this shift in thinking, as businesses see a return on their data-analytics investment across sectors and across areas of the business – from marketing to sales.

With increased data across various parts of today’s businesses, familiarity with data analysis is now an essential skill across roles and levels.

Unfortunately, most business analytics products are built to centralise and control data, not democratise it. As a result, the majority of companies are reliant on specialists just to answer basic questions. They stumble through Escher-like spreadsheets to work around inflexible business systems. Or they’re being stonewalled by enterprise-wide business intelligence platforms that spend more time in development than helping anyone.

There's no power in that approach. The power is in giving people the ability to think, act and deliver – and self-service delivery means the IT department can concentrate on its strategic role rather than helping users work out how to generate reports!

When a company empowers employees with self-service analysis tools, it shows that they are trusted and seen as capable. People start to drive their organisations forward in ways that senior management could never anticipate. The environment fosters ingenuity and creativity, and people are able to tell stories with their data.

Top tips for driving a data culture within your business:

- Get buy-in and excitement: think of data analysis as a story, and use a narrative

- Find the story first: explore the data

- Write out the story to guide your audience through the journey

- Supplement hard data with qualitative data, and add emotion

- Be visual: use pictures, graphs and charts

- Make it easy for your audience: stick to 2-3 key issues and how they relate to your audience

- Determine what you want people to do as a result: write your happy ending

- Encourage data uptake by demonstrating the benefits to the business and your colleagues’ roles – data empowerment can make business heroes!

About the Author, Edouard Beaucourt (Tableau Software)


Edouard Beaucourt is Regional Director for France, French-speaking Switzerland and North Africa, responsible for reinforcing and growing Tableau Software’s presence and driving sales of Tableau’s products within the region. Beaucourt, 35, joined Tableau Software in December 2013 as Enterprise Account Manager for France and French-speaking Switzerland. He previously held roles at IBM Business Analytics in Geneva and at Clarity Systems in Paris and Geneva, and earlier managed sales teams and channel partner programmes for Microsoft and Hyperion. He brings a background in major-account enterprise sales in the business intelligence and analytics software space and a wealth of industry insight.

Self-Service and Data Governance Empowers LOB Users


There is a major transformation underway in the use of data-centric tools within the enterprise. When it comes to solutions for data integration, data preparation, data analysis and Business Intelligence (BI), the emphasis is shifting from IT to line-of-business (LOB) users.

This is a natural evolution.  LOBs today have to deal quickly and efficiently with constantly increasing amounts of data generated by multiple, digitized sources such as social media and the Internet of Things (IoT). It’s a situation that requires a collaborative relationship with IT as LOBs become more involved in data processing to rapidly obtain the trusted, updated data they need to facilitate decision making. 

Error-ridden, unstructured data can have an immediate negative impact on business processes. For example, customers dealing with a call center with inadequate or faulty customer tracking may have to restate their issue every time they call in to resolve a problem – a major annoyance. Or a corporate marketing department may optimistically launch a major campaign only to see poor results because it relied on a database riddled with errors and omissions. According to Gartner analyst Ted Friedman, organizations estimate losing on average $8.9 million per year due to these kinds of data quality issues.

However, even though business users want to be more involved in data-centric activities, they may be spending 70 to 80% of their time preparing data without any assurance the quality will be high or that governance risks such as privacy and compliance issues are being addressed. 

LOB users are partly responsible. Data analysis tends to be an “individual sport” in which users often create their own version of the original data. It’s understandable that increasingly data-driven LOBs want to participate in processing the data relevant to their business unit. However, they must be accountable for the data they manage so that it can become a valuable asset for all parts of the organization, including IT.

Consider the marketing department mentioned above, which has created a web form on its website to capture leads for a new campaign. If the data input is not properly controlled, the campaign will introduce poor data into the CRM system, including junk emails, invalid phone numbers, duplicate records, etc. This could ultimately impact the whole organization, not just marketing-specific activities such as outbound campaigns. Think of a shipping notice or a response to a claim that ends up as undeliverable mail.
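
To make this concrete, here is a minimal sketch of the kind of input control that keeps junk emails, invalid phone numbers and duplicates out of a CRM. The field names and validation rules are illustrative assumptions, not Talend functionality:

```python
import re

# Naive email shape check, for illustration only.
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def clean_leads(raw_leads):
    """Validate and de-duplicate web-form leads before loading them into a CRM."""
    seen = set()
    clean = []
    for lead in raw_leads:
        email = lead.get("email", "").strip().lower()
        phone = re.sub(r"\D", "", lead.get("phone", ""))  # keep digits only
        if not EMAIL_RE.match(email):
            continue                      # junk or missing email
        if phone and len(phone) < 7:
            phone = ""                    # drop obviously invalid numbers
        if email in seen:
            continue                      # duplicate record
        seen.add(email)
        clean.append(dict(lead, email=email, phone=phone))
    return clean

leads = [
    {"email": "Jane.Doe@example.com", "phone": "(555) 010-9999"},
    {"email": "jane.doe@example.com", "phone": "555 010 9999"},  # duplicate
    {"email": "not-an-email", "phone": "123"},                   # junk
]
print(clean_leads(leads))  # one clean, de-duplicated record
```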

A Collaborative Solution

For today’s organizations the solution is to transfer some data processing responsibilities to the LOB while allowing IT (or other cross-functional organizations, such as the Office of the Chief Data Officer that we see in some data-driven organizations) to keep control over the various processes involved. This is in line with the trends we have observed – a need for more autonomy on the part of users, driving the use of self-service tools for data analysis, data preparation and data integration. However, this move to self-service can result in chaos if not accompanied by appropriate controls.

Data governance provides those controls – it not only allows the LOB to access the data but also ensures its quality and accuracy. Talend provides data governance capabilities to business users through the data stewardship features in its Master Data Management solution. And we are now building out a full vision of LOB self-service for our business customers, allowing them to cleanse, prepare and integrate data for their own analytical needs – whether from a file to an application or between applications.

Self-service also can empower LOBs to deal with data at an operational level. In that respect, a key part of the Talend solution is the Talend Integration Cloud, which allows “citizen integrators” – advanced users familiar with the underlying IT landscape, such as SaaS or commonly used on-premises business applications – to integrate data on an ad hoc basis or to collaborate with IT to create enterprise-ready integration scenarios.

Talend Integration Cloud is a secure cloud integration platform featuring powerful graphical tools and prebuilt components and connectors that make it simple to enrich and share data. This makes data integration projects feasible for LOB users to tackle and frees up IT for other, more strategic tasks. And it makes data integration a team game between lines of business and IT, rather than a source of conflict.

Coming in the very near future based on a similar mindset is Talend Data Preparation, a self-service solution for business analysts, which will enable them to prepare data for analysis or any other data integration or stewardship tasks.  Data Preparation is being designed not only as a productivity tool for the LOB user, but as a collaborative tool that allows an organization to share most of its data assets.  The flexibility of this new solution will enable an organization to strike the right balance between LOB autonomy and IT control depending on the sensitivity of data involved, the organization’s culture, and the role that IT plays within the enterprise.

Benefits of a New Collaborative Approach

By recognizing the shift to LOB involvement in an organization’s data centric activities, the entire enterprise will realize numerous benefits:

- LOBs save time and increase productivity with easier sharing of information and a more comprehensive view of essential data

- Marketing organizations improve their campaigns through better targeting of customers (and more generally, lines of business can more successfully meet their operational objectives through a better use of their data assets)

- The enterprise gets better control over data, including data security (avoiding major break-ins like the recent Sony hack – a company’s worst data nightmare, caused largely by uncontrolled copies of very sensitive data such as employees’ salaries and social security numbers, later published by WikiLeaks)

- With LOBs controlling Big Data governance applications, the business users have the ability to use reliable, timely data for driving a competitive advantage.

The Talend unified platform is Talend’s integrated answer to the technical challenges of data governance, data integration and preparation, together with data curation and stewardship. It is an easy-to-use, unified data management platform with more than 1,000 built-in connectors and components that make it easy to incorporate nearly any type of data source into your governance process.

With these capabilities in place, data quality has the potential to move from its traditional back office role to a far more proactive practice that can address quality issues before they even occur, and finally turn data into a day-to-day operational outcome across the lines of business. And, this approach is ushering in a new era of proactive, data aware LOB users who can work in close collaboration with their IT counterparts.

Real-Time Big Data Goes Mainstream – Are You Ready?


The fast-growing field of real-time big data analytics has reached a major turning point. Every kind of data can now feed a real-time big data supply chain at once, and that will change the way we do business.

According to IDC, the explosion of online activity by individuals and businesses and the rise of the Internet of Things (IoT) mean we live in a digital world that is growing 40% a year and doubling in size every two years. By 2020, IDC says, this digital universe will reach 44 zettabytes – that is, 44 trillion gigabytes.

Until now, only a handful of sophisticated companies with huge IT budgets and staff could mine this mountain of data for immediately actionable analytics. Technical and financial constraints kept everyone else on the sidelines of real-time big data analytics.

A Game Changer

The arrival of Talend 6 has completely changed the real-time analytics landscape. Talend 6 is the industry’s first cost-effective data integration platform to natively support not only Hadoop but also Apache Spark and Spark Streaming. With more than 100 Spark components, the platform turns streaming data and IoT sensor content into insight in real time, with unmatched performance.

Knowing what your customers did last week is important, but it is more important to know what they are doing the moment it happens and to respond immediately, improving their satisfaction. A Zen teaching captures the point: “If it is not here and now, it is meaningless.”

The new Talend 6 delivers dramatically improved performance, and by enabling a real-time big data supply chain it can transform customer-facing solutions.

For example, current Talend users can convert MapReduce jobs (Hadoop’s traditional processing model) into Spark jobs with the click of a button and improve processing performance fivefold. An intuitive design interface and Spark components that automatically generate Spark code make developers up to ten times more productive than hand coding. Talend 6 also provides a unified environment for bulk and batch processing alongside real-time, streaming and IoT data, based on a Lambda architecture.

More important than Talend 6’s product specifications, however, are the benefits the product brings to businesses and the many use cases in which it can answer data questions instantly.

Use Cases Only Real-Time Big Data Can Address

Here are just a few examples of the real-time big data integration the Talend platform makes possible:

  • Healthcare – Medical alert pendants, working with motion detectors, can automatically call for help when an elderly patient becomes immobilized.

Today, real-time big data lets healthcare providers monitor high-risk patients around the clock. When proactive care is needed, data from personal devices that track vital signs can be combined with medical-record information to alert caregivers.

  • Retail – Shopping cart abandonment – the rate at which shoppers place items in an online cart but leave without completing the purchase – is a major challenge for retailers. According to BI Intelligence, some $4 trillion worth of merchandise will be abandoned in shopping carts this year. Without real-time big data analytics, retailers cannot prevent these lost sales.

Spark-enabled data integration, working hand in hand with Spark-enabled analytics, provides the agility needed to tackle cart abandonment. With real-time big data processing you can not only predict customer buying behavior but also automatically offer customers an incentive to complete checkout.
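
As a sketch of what such a pipeline can look like under the hood, the snippet below uses the Spark 1.x-era Streaming API to flag carts that saw items added but no checkout within a micro-batch window. The Kafka topic, broker address and message format are invented for illustration – Talend’s Spark components generate comparable code from a graphical design rather than by hand:

```python
import json

from pyspark import SparkContext
from pyspark.streaming import StreamingContext
from pyspark.streaming.kafka import KafkaUtils  # Spark 1.x Streaming API

sc = SparkContext(appName="CartAbandonmentMonitor")
ssc = StreamingContext(sc, batchDuration=10)  # 10-second micro-batches

# Hypothetical topic of cart events; each message value is JSON like
# {"user": "u42", "action": "add_item" or "checkout", "value": 59.90}
stream = KafkaUtils.createDirectStream(
    ssc, ["cart-events"], {"metadata.broker.list": "localhost:9092"})
events = stream.map(lambda kv: json.loads(kv[1]))

adds = events.filter(lambda e: e["action"] == "add_item") \
             .map(lambda e: (e["user"], e["value"]))
checkouts = events.filter(lambda e: e["action"] == "checkout") \
                  .map(lambda e: (e["user"], None))

# Users who added items but did not check out in this batch window.
abandoned = adds.leftOuterJoin(checkouts) \
                .filter(lambda kv: kv[1][1] is None)

# A real pipeline would trigger an incentive (coupon email, push offer, ...).
abandoned.pprint()

ssc.start()
ssc.awaitTermination()
```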

  • Agriculture – Traditionally, farmers submitted soil samples and then waited weeks for the analysis telling them what to do to maximize their harvest.

Because Talend 6 supports data integration on Spark, data from a variety of structured and unstructured sources – for example, field data combined with traditional lab data – can be collected, analyzed and turned into reports in an instant, letting farmers make decisions based on up-to-the-moment information.

Democratizing Real-Time Analytics

With this release, Talend delivers the world’s first real-time big data integration platform and gives every company and organization the means to transform the way it does business. Real-time analytics is no longer the privilege of a select few.

By the same token, failing to run real-time big data analytics on a cost-effective integration platform like Talend 6 will now put a company at a serious competitive disadvantage.

The shift to real-time analytics means even more than that. Its benefits extend beyond real-time marketing to other operations, from manufacturing and supply chain management to human resources. This is a perfect opportunity for IT to work with other departments to get the most out of the real-time big data supply chain and find innovative applications for this advanced technology.

A new era is beginning.

You’ve Bought Into the Cloud: Now What?


For the last few years we’ve been hearing about the benefits of the cloud, and at this point most of us agree those benefits are real and sizable. So, let’s say you’ve finally agreed that it’s time to move to the cloud – what’s next?

First off, “the cloud” can mean anything – so what is it exactly?  Applications like web analytics (Google Analytics) and customer relationship management (Salesforce) can be in the cloud. You can put data in the cloud in massive data warehouses like Amazon Redshift and Google BigQuery. You can move your analytics to the cloud with offerings like Tableau Online and Birst. And you can do any or all of the above, in any order.

Companies that weren’t “born in the cloud” – meaning any company more than a couple of years old – need a plan for going cloud. Most organizations need to determine what, how and when to adopt cloud services.

Here are five strategies for transitioning to the cloud.

1. Use the right tool for the job.

If you’ve already decided to make a change, look at the toolkit of cloud services and see if there is a good option. Need a new data warehouse? An HR management solution? CRM? Consider starting with cloud there. You’ll likely have a faster implementation by going cloud, which means you’ll get value fast, and you’ll be able to start your transition without ripping out something that’s working. And chances are you’ll save money in the bargain.

For example, WildTangent, a worldwide distributor of mobile, social and online games, has moved most of its data infrastructure to the cloud. Scott Moran, Director of Business Intelligence, encourages people to look at the many cloud services on offer as a toolkit: select the right tool at the right time, and choose a tool that’s the right size for the need.

Moran moved WildTangent’s data architecture to the cloud piece by piece. He cautions that, in keeping with the toolbox analogy, you don’t want to get all your tools out at once. Use one, finish the job, and move on to the next. This helps you keep your business running as you make the transition.

2. Be as flexible as the cloud itself.

The cloud is in a stage of rapid evolution. You have the possibility of prototyping as you go, and adding volume when you’ve got it right. Keep an eye on new technologies and see how you can fit them into your workflows. Your best architecture today may not be your best architecture in a year, or even six months. A bit of tweaking can save you a lot of money.

As you consider new services, take advantage of the flexibility in the cloud. Elasticity is a characteristic of many cloud services: you can use (and pay for) a small amount at first and then scale dramatically once your concept is proven. In the cloud, you can try things out without having to commit massive infrastructure or licensing costs up front.

3. Plan for growth.

One of the advantages of a cloud infrastructure is that you can scale up easily – as long as you’ve got the right infrastructure. Take the time upfront to get your systems working as you want them, whether they be cloud applications, data or analytics. You don’t want to change from a relational database to an analytical database midstream, but you might want to double your analytic database capacity overnight. And if your business starts growing, a good system can go a long way with you – but a bad one will only add to your headaches.
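
As an illustration of that kind of overnight capacity change, here is a minimal sketch that uses boto3 to resize a hypothetical Amazon Redshift cluster; the cluster name and node count are placeholders:

```python
import boto3

redshift = boto3.client("redshift", region_name="us-east-1")

# Double a 2-node cluster to 4 nodes; Redshift redistributes the data, and
# a classic resize leaves the cluster read-only while it runs.
redshift.modify_cluster(
    ClusterIdentifier="analytics-cluster",  # placeholder cluster name
    NumberOfNodes=4,
)
```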

4. Give your users a hand (or at least a single sign-on solution!).

One of the challenges with moving to the cloud is that your users may end up with a number of different username and password combinations to remember. Luckily, there’s an app for that. Single Sign-On (SSO) solutions like OneLogin and others let your users use one password for many applications. This can significantly reduce user headaches and make users more open to adopting new solutions. It’s a good idea to favor solutions that use SAML or OAuth so that you can make use of an SSO solution when you’re ready.
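
To show why OAuth-friendly services keep this simple, here is a minimal sketch of a standard OAuth 2.0 client-credentials token exchange; the endpoint, client ID and secret are placeholders, not any real provider:

```python
import requests

# One standard token exchange works across compliant vendors.
resp = requests.post(
    "https://sso.example.com/oauth2/token",   # placeholder SSO endpoint
    data={
        "grant_type": "client_credentials",
        "client_id": "my-app",
        "client_secret": "s3cret",
    },
)
resp.raise_for_status()
token = resp.json()["access_token"]

# The same bearer token then authenticates calls to any compliant API.
api = requests.get(
    "https://api.example.com/v1/reports",     # placeholder API
    headers={"Authorization": "Bearer %s" % token},
)
print(api.status_code)
```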

5. Add even more value by broadening access to data.

If you’re building a data infrastructure in the cloud, think about how your employees will be able to use that data. If you’re moving to cloud applications, think about how you’ll integrate the data with other data in your enterprise.  Otherwise, the limiting factor in your cloud infrastructure will be the time of your data scientists.

These steps provide a great starting place for exploring the cloud.  You’ve bought into its potential, it’s now up to you to get the most out of it!

About the Author, Edouard Beaucourt (Tableau Software)

Edouard Beaucourt is Regional Director for France, French-speaking Switzerland and North Africa at Tableau Software; see the full bio above.


Building ‘Houses’ in the Cloud


In my IT career I have had the opportunity to work on many great Data Management projects, ranging from simple extract, transform and load (ETL) assignments that support operational systems like CRM, SFA, and ERP, to simple Data Warehouses.  I have been on some very impressive Master Data Management (MDM) and Data Quality projects for some of the top companies in their sectors, including both ETL and real-time Data Services integration patterns.  But, I have taken a break from that, and now work for a company that provides tools to help you build the very data fabric that all enterprises need to be successful.

Talend recently launched a new product in the integration Platform-as-a-Service (iPaaS) space that makes it even easier for customers to build and deploy their integration patterns in the cloud, where infrastructure and hardware aren’t necessary. This is a completely hosted data integration platform in the cloud, and if all your sources and targets are in the cloud, your entire solution can be hosted and run there.

As part of my new role at Talend, I am fortunate enough to have early access to many of the products and am required to become an early expert in order to train other technical professionals in the company. Sometimes this is a blessing and sometimes it’s a challenge, as I can be dragged into some really hairy projects. In this case I was excited when our CMO, Ashley Stirrup, came to me and asked if I would help build our own internal cloud-based Customer Data Warehouse (CDW). Helping to build a complete data warehouse entirely in the cloud was an opportunity I couldn’t pass up.

End-to-End Sales and Marketing Data Integration


The concept of the CDW was pretty simple, really: the executives wanted to see and measure the effectiveness of all Marketing and Sales activities from beginning to end for all our customers. The secondary project objective was to build the entire CDW using cloud technologies, including the Talend Integration Cloud platform. The three sources were Marketo (marketing automation), Salesforce.com (sales and campaign operations) and NetSuite (billing and invoicing) – all Software-as-a-Service (SaaS) platforms. We enlisted the assistance of our partner, full360, to build the data warehouse in Amazon Web Services (AWS) Redshift, with the online edition of Tableau as the visualization layer. The partner had a lot of experience with Talend’s on-premises tools but, like everyone, was new to the cloud edition. It was my job to assist with the migration of the traditional Talend jobs to the cloud – a process we referred to as “Cloudifying” the flows.

The process was very simple, and it took next to no time to build using the many components and connections Talend provides. We built the flows and tested the overall process from our local development studios. This included a full batch-control process within the Redshift tables to ensure all extracts from the sources out to AWS S3 were successful before loading data into the production reporting tables on Redshift. We also used several Data Quality Actions – Actions are predefined integration patterns used in a cloud data flow, in this case to cleanse data quality issues. Once these were defined, I saw many steps that were excellent candidates to be reused as Actions, such as the batch-control process that needed to retrieve a batch ID before every flow and then, at the end, update a table to record success or report a failure. I turned this into a simple Action that all Flows run in sequence to keep the entire process in check.
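
For readers who want to picture that pattern, here is a minimal sketch of the batch-control wrapper in plain Python against Redshift. The batch_control table, its columns and the connection details are illustrative assumptions, not the actual Talend Action or CDW schema:

```python
import uuid
import psycopg2

# conn = psycopg2.connect(host="...", dbname="cdw", user="...", password="...")

def run_with_batch_control(conn, flow_name, flow_fn):
    """Record a batch row, run the flow, then mark success or failure."""
    batch_id = uuid.uuid4().hex  # Redshift has no sequences; generate client-side
    with conn.cursor() as cur:
        cur.execute(
            "INSERT INTO batch_control (batch_id, flow_name, status, started_at) "
            "VALUES (%s, %s, 'RUNNING', GETDATE())", (batch_id, flow_name))
    conn.commit()
    try:
        flow_fn(batch_id)   # the actual extract/load work for this flow
        status = 'SUCCESS'
    except Exception:
        status = 'FAILED'
        raise
    finally:
        with conn.cursor() as cur:
            cur.execute(
                "UPDATE batch_control SET status = %s, ended_at = GETDATE() "
                "WHERE batch_id = %s", (status, batch_id))
        conn.commit()

# Usage: run_with_batch_control(conn, "marketo_extract", extract_marketo_to_s3)
```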

The best part of this CDW process was that all my testing and production deployment was a matter of a right-click and deploy, and I was done! I didn’t have to call up my favorite hardware guys and order new integration servers or database servers, because all the infrastructure was created for me in the cloud. The CDW process really is as simple as a right-click and a deploy to the cloud, and I am ready to test, schedule and run my integrations in production for my completely hosted data warehouse.

Overall, building my first completely cloud-hosted data warehouse was a great experience! Of course, many of you still have data sources that are not in the cloud and will need the on-premises functionality that Talend Integration Cloud offers, but on this CDW project it was very fulfilling to run an entire project without involving the infrastructure team or worrying about securing space in some data center. Finally, it’s important to note that as the requirements for the CDW grow, I know that AWS and Talend are both capable of scaling to meet the need with very little effort.

Final Data Warehouse Result

Three Key Takeaways from Amazon re:Invent 2015


Upon returning from AWS re:Invent, a jam-packed cloud computing frenzy in Las Vegas, I was a) pumped up about the impact that cloud environments can have for all of our customers and beyond; and b) left thinking about what key messages really stood out.

There was a lot to take in – but one thing was abundantly clear: AWS has done a remarkable job of making the move to the public cloud seem inevitable: taking things like security off the table, starting to remove some of the roadblocks around migration, and referencing a number of enormous companies who have either completed their migration or are making a major commitment. It’s impressive. But what does all this mean for AWS, Talend, and our customers? Here are my key takeaways:

Data warehouses, analytics and IoT are all driving a continued migration to the cloud

Over two days of keynotes, executives from the following companies were among those who made presentations discussing their use of Amazon Web Services’ cloud: General Electric, Capital One, John Deere and BMW. If people were wondering before if enterprises are really using this platform, AWS re:Invent 2015 proved that they are.

The cloud is still in hyper-growth mode and small-to-medium sized companies are also using the cloud more than ever before—particularly as data stores continue to grow and the economics around managing these growing stores of data become unfeasible for most.

Migration made easy with Snowball

Given that one perceived roadblock on the way to AWS is the effort of moving existing data off current servers, it is not surprising that several migration offerings were announced. Snowball is a hardened disk appliance that lets you physically ship up to 50TB of data via UPS (literally); alongside it come the Database Migration Service (DMS) and a Schema Conversion Tool. The DMS is a simple wizard that performs a like-for-like bulk migration to or from a “legacy” database and something in AWS, including data and code, compressing along the way for performance. They quote $3/TB to migrate. The Schema Conversion Tool attempts a best-fit heterogeneous migration, including data type and stored procedure conversion (with some exceptions, I’m sure).

QuickSight and the need for CLEAN data

The announcement of AWS QuickSight was an important one. At face value, it’s a tool customers can use to analyze data they already have stored in AWS’s cloud using fancy graphics. Salesforce has taken the same approach with its analytics cloud. However, as the expression goes – ‘Garbage in, garbage out’ – the success of QuickSight will depend on having really clean data. That said, both Redshift and Aurora integrate with Talend, which makes it significantly easier, faster and cheaper to get all of your data into one repository and have high-quality, clean data for analytics, so that you can have the best insights to infuse back into your business. Talend’s free Data Quality components would also play a critical role in getting the best data possible.

Symbolically, QuickSight also signals a shift. AWS has been an Infrastructure-as-a-Service company – it provides virtual machines, storage, databases and a whole lot of other cool cloud-based infrastructure components. QuickSight is a Software-as-a-Service offering.

So what does all this mean for the future? The cloud is becoming the obvious choice as the future of data warehousing. It’s more flexible, scalable, affordable and insured. And now, with these new offerings, AWS is making it even easier for customers to get into the cloud. All of these tools are being made more affordable and available so that companies looking to transform their organization into a data-driven business have a fast, painless, and obvious path to take.

In addition to the myriad of messages targeted at CxOs, at re:Invent AWS also had the full attention of the world’s leading edge developers (both corporate & ISV) and it’s clear that we’re seeing the next winning multi-year platform franchise emerging.  AWS is (reportedly) already at a $7B annual revenue run rate, up 85% from last year (with their Q3 earnings expected to be announced today). Wow. All of this adds up to the fact that there is nothing but momentum in this space and Talend + AWS makes perfect sense.

Talend Connect: Step into the future of Big Data!


Save the date! Talend Connect will be back in Paris on November 18 and 19.

This year the Talend Connect event will take place over two days. The first day will be entirely devoted to Talend's partners. Over the course of the day we will showcase the new features of Talend 6, the first data integration platform on Spark. We will also present our new partnership programs. The second day is reserved for Talend customers and users. In addition to the presentation of Talend 6, visitors will be able to attend focused sessions on the latest innovations in Talend Integration Cloud and Talend MDM. Talend customers representing a variety of industries including Air France-KLM, Schneider Electric and m2ocity will take the stage to explain how they worked with Talend to revolutionize their business and make the most of their data. The attendees will also have the opportunity to connect with members of our executive team.

More information about the event and registration (in French)

Partner Day: November 18

The world of Talend partners – integrators and consulting firms, resellers, technology and OEM partners – continues to grow and play an essential role in the success of our customers' projects. We encourage ongoing discussions and collaboration with our partners at all times. This philosophy has driven Talend since its creation. Today, approximately 30% of the company's sales originate directly from partners.

We decided to dedicate an entire day to Talend's wide array of partners and to take stock of the various partner programs we offer. We’ll take the time to highlight the most significant achievements of the past year and offer our partners the opportunity to meet and engage exclusively with members of our Research & Development team.

Focus on Resellers: Specific attention will be given to Talend's new VAR (Value Added Reseller) program. Launched in January 2015, this program was customized to help resellers expand their relationships with their existing customers, acquire new customers and create new sources of recurring revenue. Talend Connect creates a platform for resellers to interact with the indirect sales teams who are responsible for the deployment of this program.

User Day: November 19

While the presentation of the latest features in Talend 6 will be the main attraction – in particular, the ability to integrate Big Data in real time into production environments – Talend Connect will give users a close-up demonstration of Talend’s entire range of new features and solutions.

Focus on Big Data: Talend Connect provides an ideal opportunity to discover the leading innovations and best operating practices of Big Data. Talend's activity in this area continues to be a major driver of growth for the company. After recording a 78% increase in sales in the first quarter of 2015 compared to the same quarter in 2014, Talend continued its growth during the first half of 2015, with the number of annual renewable contracts increasing by 66%. This increase was marked by a 92% hike in the number of new customers for Big Data solutions.

Talend has quickly established itself as the integration industry leader. This is demonstrated by how quickly Talend’s solutions receive certification from the leading Hadoop distribution vendors – Cloudera, Hortonworks, MapR and AWS. Beyond Hadoop, Talend also benefits from strategic and technology partnerships with leaders like MongoDB, DataStax, Teradata and Vertica, to name a few.

Talend Data Masters Awards

The winners of the first-ever Talend Data Masters Awards will be announced at Talend Connect. The Data Masters Awards program is designed to highlight and reward the most innovative uses of Talend solutions. Winners will be selected on a range of criteria, including market impact and innovation, project scale and complexity, use of Big Data technology, and the overall business value achieved. Awards will be given across a number of project categories, including Master Data Management (MDM), Data Integration, Big Data, Enterprise Service Bus (ESB) and Data Quality.

Finalists are being selected from over 1,700 Talend customers worldwide, representing the public and private sectors and operating in markets including distribution, finance, healthcare and industrial products.

Special Thanks to Our Supporters

Talend Connect benefits from the support of Keyrus, CGI, Business & Decision, EXL Group, Edis JEMS Group, Micropole, Microstrategy, MapR, Synaltic, Accenture, Smile, Cloudera and Sopra Steria, the sponsors of the 2015 Paris event.

I am very excited about welcoming users, customers and all the members of the community to the next French session of Talend Connect. Around 300 users and customers are expected this year, and we hope you'll be among them!


Our Sandbox has Better Toys


Today we launched a new real-time big data Sandbox, a super quick and painless way for you to gain first-hand experience with one of the latest big data innovations, Apache Spark. For those of you not familiar with a Sandbox, it’s basically a virtual development environment. Ours combines our latest data integration sensation, Talend 6, with some great ready-to-run, real-time data scenarios – plus a step-by-step Big Data Insights Cookbook. Within minutes you’ll be up and running, impressing your friends by using Talend to turn data into real-time decisions, with test examples that will get your feet wet with Apache Kafka, Spark, Spark Streaming and NoSQL.

Specifically, the demo scenarios show a simple version of how to turn a website into an intelligent application. You’ll build a recommendation model using Spark machine learning, and set up a new Kafka topic to simulate live traffic from users browsing a web storefront. Most importantly, you will learn first-hand how to take streaming data and turn it into real-time recommendations that can dramatically improve sales.
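
If you want a feel for what the recommendation scenario does under the covers, here is a minimal sketch using the Spark 1.x MLlib ALS API. The interaction data is made up, and the Sandbox builds the equivalent job graphically in Talend Studio rather than in hand-written code:

```python
from pyspark import SparkContext
from pyspark.mllib.recommendation import ALS, Rating

sc = SparkContext(appName="StorefrontRecommendations")

# (user, product, implicit strength) triples, e.g. from clickstream counts.
views = sc.parallelize([
    Rating(1, 101, 5.0), Rating(1, 102, 1.0),
    Rating(2, 101, 4.0), Rating(2, 103, 2.0),
    Rating(3, 102, 3.0), Rating(3, 103, 5.0),
])

# Train an implicit-feedback model: rank-10 factors, 10 iterations.
model = ALS.trainImplicit(views, rank=10, iterations=10)

# Recommend three products for user 1.
for rec in model.recommendProducts(1, 3):
    print(rec.product, rec.rating)
```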

So, whether you are a developer looking to sharpen your skills, a data architect or scientist looking to test some exciting new technologies for your company or a competitor looking to spy on Talend, we welcome you all with open arms; because, as they say, you have to play nice in the sandbox.

Remember, our Sandbox includes a 30-day evaluation of Talend 6, so the fun doesn’t have to stop at just the scenarios provided. Feel free to continue to use Talend 6 to test out some in-house projects; just don’t blame us if you get hooked.
