    moeinkiller2005 Posted about 4 hours ago


    The CAP theorem specifies a kind of upper limit on what can be achieved when designing distributed systems.

    The three letters "C", "A" and "P" stand for, in order, "Consistency", "Availability" and "Partition Tolerance". OK, so what relevance does this have to the continuously emerging world of "Big Data"?
    ...
    Most NoSQL data stores try to achieve availability and partition tolerance together, at the cost of strong consistency.
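    To make that trade-off concrete, here is a purely illustrative Python sketch (not modelled on any particular data store) of two replicas that keep accepting reads and writes during a network partition; the cost is that reads from the lagging replica can be stale until the partition heals.
    # Illustrative sketch of the AP trade-off: both replicas stay available
    # during a partition, but reads can be stale (eventual consistency).
    class Replica:
        def __init__(self):
            self.value = None
            self.version = 0

        def write(self, value, version):
            if version > self.version:   # last-writer-wins on reconciliation
                self.value, self.version = value, version

    a, b = Replica(), Replica()

    def replicate(src, dst, partitioned):
        if not partitioned:              # replication only works while the link is up
            dst.write(src.value, src.version)

    # Normal operation: a write to replica "a" reaches replica "b".
    a.write("v1", 1); replicate(a, b, partitioned=False)

    # During a partition, "a" still accepts the write (availability) ...
    a.write("v2", 2); replicate(a, b, partitioned=True)
    print(b.value)   # -> 'v1': a stale read, strong consistency was given up

    # When the partition heals, the replicas converge again.
    replicate(a, b, partitioned=False)
    print(b.value)   # -> 'v2'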
    moeinkiller2005 Posted about 15 hours ago


    The software company provides an enterprise version of Apache Hadoop, which is widely used in big data analytics. Cloudera has improved security, cloud management and its analytics database, known as Impala 2.0.
    Cloudera's latest release includes a platform collaboration with Intel to improve security, as well as the following:
    • Support for roles and edit permissions in Cloudera's open source user interface for Hadoop.
    • Management of keys via its Navigator Key Trustee Server.
    • Improvements to the auditing user interface, component coverage and extensions.
    • MapReduce, used for batch processing, uses native tasks and is optimized for storage gear from EMC.
    • Integration points with Cloudera's 1,200 partners.
    • Cloudera Director, a self-service system to deploy Cloudera and support Amazon Web Services.
    • An overhaul of Impala to improve concurrent workloads as well as SQL database functionality and stream processing.
    Reza Posted 2014-10-24 21:28:04Z
    In machine learning, support vector machines (SVMs, also support vector networks[1]) are supervised learning models with associated learning algorithms that analyze data and recognize patterns, used for classification and regression analysis. Given a set of training examples, each marked as belonging to one of two categories, an SVM training algorithm builds a model that assigns new examples into one category or the other, making it a non-probabilistic binary linear classifier. An SVM model is a representation of the examples as points in space, mapped so that the examples of the separate categories are divided by a clear gap that is as wide as possible. New examples are then mapped into that same space and predicted to belong to a category based on which side of the gap they fall on.
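    As a quick illustration of the description above, here is a minimal sketch using scikit-learn's SVC; the toy 2-D points and labels are made up for the example.
    # Minimal SVM classification sketch with scikit-learn (toy data, for illustration only).
    from sklearn import svm

    # Training examples: points in 2-D space, each labelled with one of two categories.
    X = [[0, 0], [0, 1], [1, 0], [2, 2], [2, 3], [3, 2]]
    y = [0, 0, 0, 1, 1, 1]

    # A linear kernel finds the maximum-margin hyperplane separating the two categories.
    clf = svm.SVC(kernel="linear")
    clf.fit(X, y)

    # New examples are assigned a category based on which side of the gap they fall on.
    print(clf.predict([[0.5, 0.5], [2.5, 2.5]]))   # expected: [0 1]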
    sasan Posted 2014-10-24 19:04:41Z
    One of the most interesting and challenging parts of being a developer is staying on top of your skills. This includes evaluating emerging technologies, learning new techniques, and not falling prey to the constant influx of new things to learn. These 10 tips, based on different forms of learning and time commitment, will help you juggle it all.
    sasan Posted 2014-10-22 07:19:14Z
    Firebase is Joining Google!

    Over the past three years, we’ve gone from a crazy idea that ‘just might work’ to a proven product used by 110,000 developers. Today, I couldn’t be happier to announce that we’ve joined Google.

    Why?
    Two big reasons.

    First, Google’s backing allows us to dramatically scale Firebase. We’re proud of what we’ve built over the last three years, but we’ve only scratched the surface of the opportunity ahead of us. With Google’s engineering talent, resources, and technical infrastructure, we’ll be able to do much more, much faster.

    Second, our products and missions are highly complementary. Both the Firebase and Google Cloud Platform teams come to work each day for the same reason: to help developers create extraordinary experiences. By joining forces, Firebase developers will gain access to a powerful cloud infrastructure suite, and Cloud Platform customers will gain access to our rapid development capabilities. Together we’ll deliver a complete platform for mobile and web apps.

    sasan Posted 2014-10-21 02:38:55Z
    News is a direct tool of influence on public consciousness. The goal of the news is to provide society with objective information about a particular event or action. In reality, news is often just a retelling of events, passed through the prism of an author’s own subjective judgment, beliefs and attitudes. Authorship of the news inevitably leads to inconsistencies in the coverage of the same events. Moreover, the author often deliberately covers events from a point of view that benefits him, which he can thereby use to manipulate public opinion.


    For more information, please refer to the reference below.
    moeinkiller2005 Posted 2014-10-19 19:20:24Z


    Hadoop is a wonderful creation, but it's evolving quickly and it can exhibit flaws. Here are my dozen downers...
    moeinkiller2005 Posted 2014-10-19 09:38:46Z


    Starred articles were potential candidates for our picture of the week, published in our weekly digest. Enjoy our new selection of articles and resources (R, data science, Python, machine learning, etc.). Comments are from Vincent Granville.
    sasan Posted 2014-10-18 08:56:18Z
    Webmaster level: intermediate-advanced

    Submitting sitemaps can be an important part of optimizing websites. Sitemaps enable search engines to discover all pages on a site and to download them quickly when they change. This blog post explains which fields in sitemaps are important, when to use XML sitemaps and RSS/Atom feeds, and how to optimize them for Google.

    Sitemaps and feeds
    Sitemaps can be in XML sitemap, RSS, or Atom formats. The important difference between these formats is that XML sitemaps describe the whole set of URLs within a site, while RSS/Atom feeds describe recent changes. This has important implications:
    • XML sitemaps are usually large; RSS/Atom feeds are small, containing only the most recent updates to your site.
    • XML sitemaps are downloaded less frequently than RSS/Atom feeds.
    For optimal crawling, we recommend using both XML sitemaps and RSS/Atom feeds. XML sitemaps will give Google information about all of the pages on your site. RSS/Atom feeds will provide all updates on your site, helping Google to keep your content fresher in its index. Note that submitting sitemaps or feeds does not guarantee the indexing of those URLs.

    Example of an XML sitemap:
    <?xml version="1.0" encoding="utf-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"> 
     <url> 
       <loc>http://example.com/mypage</loc> 
       <lastmod>2011-06-27T19:34:00+01:00</lastmod> 
       <!-- optional additional tags --> 
     </url> 
     <url> 
       ... 
     </url> 
     </urlset>

    Example of an RSS feed:
    <?xml version="1.0" encoding="utf-8"?>
    <rss> 
     <channel> 
       <!-- other tags --> 
       <item> 
         <!-- other tags --> 
         <link>http://example.com/mypage</link> 
         <pubDate>Mon, 27 Jun 2011 19:34:00 +0100</pubDate> 
       </item> 
       <item> 
         ... 
       </item> 
     </channel> 
     </rss>

    Example of an Atom feed:
    <?xml version="1.0" encoding="utf-8"?>
    <feed xmlns="http://www.w3.org/2005/Atom"> 
     <!-- other tags --> 
     <entry> 
       <link href="http://example.com/mypage" /> 
       <updated>2011-06-27T19:34:00+01:00</updated> 
       <!-- other tags --> 
     </entry> 
     <entry> 
       ... 
     </entry> 
     </feed>

    “other tags” refer to both optional and required tags by their respective standards. We recommend that you specify the required tags for Atom/RSS, as they will help you to appear on other properties that might use these feeds, in addition to Google Search.

    Best practices

    Important fields
    XML sitemaps and RSS/Atom feeds, at their core, are lists of URLs with metadata attached to them. The two most important pieces of information for Google are the URL itself and its last modification time.

    URLs
    URLs in XML sitemaps and RSS/Atom feeds should adhere to the following guidelines:
    • Only include URLs that can be fetched by Googlebot. A common mistake is including URLs disallowed by robots.txt (which cannot be fetched by Googlebot), or including URLs of pages that don't exist.
    • Only include canonical URLs. A common mistake is to include URLs of duplicate pages. This increases the load on your server without improving indexing.

    Last modification time
    Specify a last modification time for each URL in an XML sitemap and RSS/Atom feed. The last modification time should be the last time the content of the page changed meaningfully. If a change is meant to be visible in the search results, then the last modification time should be the time of this change.
    • XML sitemaps use <lastmod>
    • RSS uses <pubDate>
    • Atom uses <updated>
    Be sure to set or update the last modification time correctly:
    • Specify the time in the correct format: W3C Datetime for XML sitemaps, RFC 3339 for Atom and RFC 822 for RSS.
    • Only update the modification time when the content changed meaningfully.
    • Don’t set the last modification time to the current time whenever the sitemap or feed is served.

    XML sitemaps
    XML sitemaps should contain URLs of all pages on your site. They are often large and updated infrequently. Follow these guidelines:
    • For a single XML sitemap: update it at least once a day (if your site changes regularly) and ping Google after you update it.
    • For a set of XML sitemaps: maximize the number of URLs in each XML sitemap. The limit is 50,000 URLs or a maximum size of 10 MB uncompressed, whichever is reached first. Ping Google for each updated XML sitemap (or once for the sitemap index, if that is used) every time it is updated. A common mistake is to put only a handful of URLs into each XML sitemap file, which usually makes it harder for Google to download all of these XML sitemaps in a reasonable time.

    RSS/Atom
    RSS/Atom feeds should convey the recent updates of your site. They are usually small and updated frequently. For these feeds, we recommend:
    • When a new page is added or an existing page meaningfully changed, add the URL and the modification time to the feed.
    • In order for Google not to miss updates, the RSS/Atom feed should contain all updates made since at least the last time Google downloaded it. The best way to achieve this is by using PubSubHubbub. The hub will propagate the content of your feed to all interested parties (RSS readers, search engines, etc.) in the fastest and most efficient way possible.
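    As a rough sketch of the guidelines above, the following Python snippet builds a small XML sitemap with <lastmod> values in W3C Datetime format; the page list, the output file name and the commented-out ping URL are illustrative assumptions rather than part of the original post.
    # Sketch: generate a minimal XML sitemap with correctly formatted <lastmod> values.
    from datetime import datetime, timezone
    from xml.etree import ElementTree as ET

    # (canonical URL, last time the page content changed meaningfully) -- example data
    pages = [
        ("http://example.com/mypage", datetime(2011, 6, 27, 19, 34, tzinfo=timezone.utc)),
    ]

    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for loc, lastmod in pages:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        # W3C Datetime, e.g. 2011-06-27T19:34:00+00:00
        ET.SubElement(url, "lastmod").text = lastmod.isoformat()

    ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)

    # After updating the sitemap, ping Google so it is re-downloaded promptly
    # (endpoint as documented at the time; treat it as an assumption here):
    # import urllib.request, urllib.parse
    # urllib.request.urlopen("http://www.google.com/ping?sitemap="
    #                        + urllib.parse.quote("http://example.com/sitemap.xml", safe=""))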
    Generating both XML sitemaps and Atom/RSS feeds is a great way to optimize crawling of a site for Google and other search engines. The key information in these files is the canonical URL and the time of the last modification of pages within the website. Setting these properly, and notifying Google and other search engines through sitemap pings and PubSubHubbub, will allow your website to be crawled optimally and represented accordingly in search results.

    If you have any questions, feel free to post them here, or to join other webmasters in the webmaster help forum section on sitemaps.
    sasan Posted 2014-10-18 08:54:39Z
    Today we are publishing details of a vulnerability in the design of SSL version 3.0. This vulnerability allows the plaintext of secure connections to be calculated by a network attacker. I discovered this issue in collaboration with Thai Duong and Krzysztof Kotowicz (also Googlers).

    SSL 3.0 is nearly 18 years old, but support for it remains widespread. Most importantly, nearly all browsers support it and, in order to work around bugs in HTTPS servers, browsers will retry failed connections with older protocol versions, including SSL 3.0. Because a network attacker can cause connection failures, they can trigger the use of SSL 3.0 and then exploit this issue.

    Disabling SSL 3.0 support, or CBC-mode ciphers with SSL 3.0, is sufficient to mitigate this issue, but presents significant compatibility problems, even today. Therefore our recommended response is to support TLS_FALLBACK_SCSV. This is a mechanism that solves the problems caused by retrying failed connections and thus prevents attackers from inducing browsers to use SSL 3.0. It also prevents downgrades from TLS 1.2 to 1.1 or 1.0 and so may help prevent future attacks.
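    As one concrete way to apply the "disable SSL 3.0" mitigation, here is a minimal sketch using Python's standard ssl module (it does not implement TLS_FALLBACK_SCSV itself):
    # Sketch: build an SSL context that refuses SSLv3 (and SSLv2), allowing only TLS versions.
    import ssl

    ctx = ssl.SSLContext(ssl.PROTOCOL_SSLv23)   # negotiate the highest mutually supported version
    ctx.options |= ssl.OP_NO_SSLv2 | ssl.OP_NO_SSLv3
    # A peer that can only speak SSL 3.0 will now fail the handshake instead of
    # silently downgrading to the vulnerable protocol.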

    Google Chrome and our servers have supported TLS_FALLBACK_SCSV since February and thus we have good evidence that it can be used without compatibility problems. Additionally, Google Chrome will begin testing changes today that disable the fallback to SSL 3.0. This change will break some sites and those sites will need to be updated quickly.

    In the coming months, we hope to remove support for SSL 3.0 completely from our client products. 

    Thank you to all the people who helped review and discuss responses to this issue.