Search Results

Search found 31448 results on 1258 pages for 'google analytics api'.


  • Google App Engine datastore encoding?

    - by sernaferna
    I'm using the GAE datastore for a Java application, and storing some text that will be in numerous languages. In my servlet, I'm first checking to see if there's any data in the data store, and, if not, I'm creating some, similar to the following: ArrayList<Lang> list = new ArrayList<Lang>(); list.add(new Lang("EN", "English", 1)); list.add(new Lang("ES", "Español", 0)); /* more languages here... */ PersistenceManager pm = PMF.get().getPersistenceManager(); for(Lang l : list) { pm.makePersistent(l); } Since this is using JDO, I guess I should include the relevant parts of the Lang class too: @PersistenceCapable public class Lang { @PrimaryKey private String code; @Persistent private String name; @Persistent private int popularity; /* getters & setters & constructors... */ } However, the non-ASCII characters are giving me grief. I've set my Eclipse project to use the UTF-8 encoding instead of the default Cp1252, so I think I'm okay from that perspective, but when I use the App Engine Data Viewer to look at my data, that Español entry becomes EspaÃ±ol, and when I click on it to view it, I get a 500 Server Error. (There are some other entries with right-to-left text that don't even show up in the Data Viewer at all, but one problem at a time...) Is there anything special I can do in my code to set the character encoding, or specify to GAE that the data I'm storing is UTF-8? Or is the problem on the Eclipse side, and is there something I should be doing with my Java code?
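
    Since the values above are hard-coded Java string literals, they depend on the source file being compiled as UTF-8 (the Eclipse project setting, or javac -encoding UTF-8), which the question has already addressed. If the same text ever arrives through an HTTP request instead, the other usual suspects are the request and response encodings. A minimal servlet sketch, purely illustrative and not specific to GAE (the servlet name and "name" parameter are invented for the example):

      import java.io.IOException;
      import javax.servlet.http.HttpServlet;
      import javax.servlet.http.HttpServletRequest;
      import javax.servlet.http.HttpServletResponse;

      public class LangServlet extends HttpServlet {
          @Override
          protected void doPost(HttpServletRequest req, HttpServletResponse resp) throws IOException {
              // Must be called before any parameter is read, otherwise the container's default applies.
              req.setCharacterEncoding("UTF-8");
              // Emit the response as UTF-8 as well, so text like "Español" round-trips cleanly.
              resp.setContentType("text/html; charset=UTF-8");

              String name = req.getParameter("name"); // hypothetical form field
              resp.getWriter().println(name);
          }
      }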

    Read the article

  • Google App-Engine Java Batch Update

    - by Manjoor
    I need to upload a .csv file and save the records in Bigtable. My application successfully parses 200 records from the csv file and saves them to the table. Here is my code to save the data: for (int i = 0; i < lines.length - 1; i++) /* lines holds all records in the csv file */ { String line = lines[i]; /* the record has 3 columns: integer, integer, Text */ if (line.length() > 15) { int n = line.indexOf(","); if (n > 0) { int ID = Integer.parseInt(line.substring(0, n)); int n1 = line.indexOf(",", n + 2); if (n1 > n) { int Col1 = Integer.parseInt(line.substring(n + 1, n1)); String Col2 = line.substring(n1 + 1); myTable uu = new myTable(); uu.setId(ID); uu.setCol1(Col1); Text t = new Text(Col2); uu.setCol2(t); PersistenceManager pm = PMF.get().getPersistenceManager(); pm.makePersistent(uu); pm.close(); } } } } But when the number of records grows, it gives a timeout error. The csv file may have up to 800 records. Is it possible to do this in App Engine (something like a batch update)?
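
    One way to cut the per-row overhead, assuming the same myTable entity and PMF helper from the code above, is to open a single PersistenceManager and write the rows in chunks with makePersistentAll() instead of opening a manager and making one datastore call per record; for files too large to finish within the request deadline, the work would additionally need to be split across Task Queue tasks. A minimal sketch, reusing the parsing logic from the question:

      import java.util.ArrayList;
      import java.util.List;
      import javax.jdo.PersistenceManager;
      import com.google.appengine.api.datastore.Text;

      public class CsvImporter {
          private static final int BATCH_SIZE = 100; // illustrative chunk size

          public void importRows(String[] lines) {
              PersistenceManager pm = PMF.get().getPersistenceManager();
              try {
                  List<myTable> batch = new ArrayList<myTable>();
                  for (String line : lines) {
                      myTable row = parseLine(line);
                      if (row == null) {
                          continue; // skip malformed lines
                      }
                      batch.add(row);
                      if (batch.size() == BATCH_SIZE) {
                          pm.makePersistentAll(batch); // one batched datastore write per chunk
                          batch.clear();
                      }
                  }
                  if (!batch.isEmpty()) {
                      pm.makePersistentAll(batch);
                  }
              } finally {
                  pm.close();
              }
          }

          // Same "ID,Col1,Col2" parsing as in the question, with Col2 as free text.
          private myTable parseLine(String line) {
              if (line.length() <= 15) return null;
              int n = line.indexOf(",");
              if (n <= 0) return null;
              int n1 = line.indexOf(",", n + 1);
              if (n1 <= n) return null;
              myTable row = new myTable();
              row.setId(Integer.parseInt(line.substring(0, n)));
              row.setCol1(Integer.parseInt(line.substring(n + 1, n1)));
              row.setCol2(new Text(line.substring(n1 + 1)));
              return row;
          }
      }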

    Read the article

  • Change|Assign parent for the Model instance on Google App Engine Datastore

    - by Vladimir Prudnikov
    Is it possible to change or assign a new parent to a Model instance that is already in the datastore? For example, I need something like this: task = db.get(db.Key(task_key)) project = db.get(db.Key(project_key)) task.parent = project task.put() but it doesn't work this way, because task.parent is a built-in method. I was thinking about creating a new Key instance for the task, but there is no way to change the key either. Any thoughts?

    Read the article

  • innerHTML parse for Google Chrome?

    - by user554095
    Hey all. I am adding text to a textarea via JavaScript in Chrome (doing it a totally different way in Firefox/IE, as Chrome does not support contentWindow). Here is my code: document.getElementById("vB_Editor_001_textarea").value += '<font color="white"><b>' + numberLabel + ':</b> ' + armory + '</font><br/>'; This HTML is getting put directly into the textarea without being parsed. How can I make it render as formatted text instead of raw HTML markup when it hits the textarea? Thanks!

    Read the article

  • NoSQL Memcached API for MySQL: Latest Updates

    - by Mat Keep
    With data volumes exploding, it is vital to be able to ingest and query data at high speed. For this reason, MySQL has implemented NoSQL interfaces directly to the InnoDB and MySQL Cluster (NDB) storage engines, which bypass the SQL layer completely. Without SQL parsing and optimization, key-value data can be written directly to MySQL tables up to 9x faster, while maintaining ACID guarantees. In addition, users can continue to run complex queries with SQL across the same data set, providing real-time analytics to the business or anonymizing sensitive data before loading it into big data platforms such as Hadoop, while still maintaining all of the advantages of their existing relational database infrastructure. This and more is discussed in the latest Guide to MySQL and NoSQL, where you can learn more about using the APIs to scale new generations of web, cloud, mobile and social applications on the world's most widely deployed open source database.

    The native Memcached API is part of the MySQL 5.6 Release Candidate, and is already available in the GA release of MySQL Cluster. By using the ubiquitous Memcached API for writing and reading data, developers can preserve their investments in Memcached infrastructure by re-using existing Memcached clients, while also eliminating the need for application changes.

    Speed, when combined with flexibility, is essential in the world of growing data volumes and variability. Complementing NoSQL access, support for online DDL (Data Definition Language) operations in MySQL 5.6 and MySQL Cluster enables DevOps teams to dynamically update their database schema to accommodate rapidly changing requirements, such as the need to capture additional data generated by their applications. These changes can be made without database downtime. Using the Memcached interface, developers do not need to define a schema at all when using MySQL Cluster.

    Let's look a little more closely at the Memcached implementations for both InnoDB and MySQL Cluster.

    Memcached Implementation for InnoDB

    The Memcached API for InnoDB is previewed as part of the MySQL 5.6 Release Candidate. As illustrated in the following figure, Memcached for InnoDB is implemented via a Memcached daemon plug-in to the mysqld process, with the Memcached protocol mapped to the native InnoDB API.

    Figure 1: Memcached API Implementation for InnoDB

    With the Memcached daemon running in the same process space, users get very low latency access to their data while also leveraging the scalability enhancements delivered with InnoDB and a simple deployment and management model. Multiple web / application servers can remotely access the Memcached / InnoDB server to get direct access to a shared data set. With simultaneous SQL access, users can maintain all the advanced functionality offered by InnoDB, including support for foreign keys, XA transactions and complex JOIN operations.

    Benchmarks demonstrate that the NoSQL Memcached API for InnoDB delivers up to 9x higher performance than the SQL interface when inserting new key/value pairs, with a single low-end commodity server supporting nearly 70,000 transactions per second.

    Figure 2: Over 9x Faster INSERT Operations

    The delivered performance demonstrates that MySQL with the native Memcached NoSQL interface is well suited for high-speed inserts, with the added assurance of transactional guarantees. You can check out the latest Memcached / InnoDB developments and benchmarks here, and you can learn how to configure the Memcached API for InnoDB here.

    Memcached Implementation for MySQL Cluster

    Memcached API support for MySQL Cluster was introduced with General Availability (GA) of the 7.2 release, and joins an extensive range of NoSQL interfaces that are already available for MySQL Cluster. Like Memcached, MySQL Cluster provides a distributed hash table with in-memory performance. MySQL Cluster extends Memcached functionality by adding support for write-intensive workloads, a full relational model with ACID compliance (including persistence), rich query support, auto-sharding and 99.999% availability, with extensive management and monitoring capabilities. All writes are committed directly to MySQL Cluster, eliminating cache invalidation and the overhead of data consistency checking, to ensure complete synchronization between the database and cache.

    Figure 3: Memcached API Implementation with MySQL Cluster

    Implementation is simple:

    1. The application sends reads and writes to the Memcached process (using the standard Memcached API).
    2. This invokes the Memcached Driver for NDB (which is part of the same process).
    3. The NDB API is called, providing very quick access to the data held in MySQL Cluster's data nodes.

    The solution has been designed to be very flexible, allowing the application architect to find a configuration that best fits their needs. It is possible to co-locate the Memcached API in either the data nodes or application nodes, or alternatively within a dedicated Memcached layer. The benefit of this flexible approach to deployment is that users can configure behavior on a per-key-prefix basis (through tables in MySQL Cluster) and the application doesn't have to care; it just uses the Memcached API and relies on the software to store data in the right place(s) and to keep everything synchronized.

    Using Memcached for Schema-less Data

    By default, every key/value is written to the same table, with each key/value pair stored in a single row, thus allowing schema-less data storage. Alternatively, the developer can define a key prefix so that each value is linked to a pre-defined column in a specific table. Of course, if the application needs to access the same data through SQL, developers can map key prefixes to existing table columns, enabling Memcached access to schema-structured data already stored in MySQL Cluster.

    Conclusion

    Download the Guide to MySQL and NoSQL to learn more about the NoSQL APIs and how you can use them to scale new generations of web, cloud, mobile and social applications on the world's most widely deployed open source database. See how to build a social app with MySQL Cluster and the Memcached API in our on-demand webinar, or take a look at the docs. Don't hesitate to use the comments section below for any questions you may have.
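
    Since the whole point of the plugin is protocol compatibility, any standard Memcached client can exercise it. A minimal sketch using the open-source spymemcached Java client, assuming the Memcached endpoint exposed by MySQL (the InnoDB plugin or MySQL Cluster's Memcached server) is listening on localhost:11211; the host, port and key are illustrative only:

      import java.net.InetSocketAddress;
      import net.spy.memcached.MemcachedClient;

      public class MySqlMemcachedDemo {
          public static void main(String[] args) throws Exception {
              // Connect to the Memcached endpoint exposed by MySQL (illustrative host/port).
              MemcachedClient client = new MemcachedClient(new InetSocketAddress("127.0.0.1", 11211));

              // Write a key/value pair; with the InnoDB/NDB plugins this lands in a MySQL table.
              client.set("user:42", 0, "some value").get(); // 0 = no expiry; get() waits for the write

              // Read it back over the same protocol (or query the backing table with SQL).
              System.out.println(client.get("user:42"));

              client.shutdown();
          }
      }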

    Read the article

  • Custom Themes Introduced in Gmail

    - by Rekha
    As we all know, the Google team introduced a number of HD themes last November. Now they are giving us the option of customizing our own background. We can upload our own images, select from our Google+ photos or just paste any image URL, or we can browse the Featured Photos section to find an image that we like. They are introducing custom themes with two options, Light and Dark. In the Featured tab, we can simply search for specific kinds of pictures like "hdr scenery" or "bokeh wallpaper" and so on. We can easily maintain our personal and work accounts with different background images that suit each best. The company announced this in their blog today.

    Read the article

  • Unindexing my tumblr blog's content and moving it to another tumblr blog

    - by sam
    I've been writing a tumblr blog for the past year or so, and I've written about 300 articles, but now I need to move the blog to another site (before, it was running under blog.mysite.com and I now want it to run under blog.my*new*site.com). I want to keep the archived articles and have them on the new site, so what I was hoping to do was export the blog from tumblr, go into Webmaster Tools and remove all the blog's indexed URLs from Google, then make a new tumblr blog and import the posts. Would Google see this as new content, since I've deleted their indexed copy? Or could I just move the mapping of the tumblr blog to the new subdomain? But in doing that I would lose all the PR, and it would still look like duplicate content. What's the best way to approach this?

    Read the article

  • 410 Responses when your CMS host doesn't support them?

    - by leeand00
    Sending a 410 response for a page that no longer exists should make Google stop crawling that page. The site I am working on was recently migrated, and very little of the content was migrated with it. I've already turned the existing content into 301 redirects (the content that is on both the old and the new site), but now I would like to flush the old content from Google's memory by returning 410 responses when it comes back to crawl those URLs, instead of the 404 it currently finds. However, I asked our CMS host about it, and they said that our CMS does not support 410 responses. Is there some other way to produce a 410 response, like making a dead link 301 redirect to a page that returns a 410 response in the form of a meta tag?
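
    Where some server-side hook is available, a 410 only requires setting the status code on the response; as an illustration of what the CMS would need to do (shown here as a Java servlet purely for the sake of example, not as anything this particular CMS supports):

      import java.io.IOException;
      import javax.servlet.http.HttpServlet;
      import javax.servlet.http.HttpServletRequest;
      import javax.servlet.http.HttpServletResponse;

      // Illustrative handler that answers requests for retired URLs with 410 Gone.
      public class GoneServlet extends HttpServlet {
          @Override
          protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
              // SC_GONE is the servlet-API constant for HTTP status 410.
              resp.sendError(HttpServletResponse.SC_GONE, "This content has been permanently removed.");
          }
      }

    There is no meta-tag equivalent of an HTTP status code; the closest in-page alternative Google documents is the robots noindex meta tag, which asks for the URL to be dropped from the index even though the page itself still returns 200.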

    Read the article

  • Enhance Your Gmail Account in Chrome

    - by Asian Angel
    Are you tired of items like the Chat and Invite Boxes cluttering up your Gmail account? Then join us as we look at the Better Gmail extension for Google Chrome.

    Before

    Here are some examples of items that you may be tired of looking at in your Gmail account, such as the "Footer" below your "Inbox", the "Chat Box", and the "Invitation Box". Perhaps you would also like to have the "New Window, Print all, & Create a document" commands moved elsewhere. And of course there is everyone's "favorite": the sponsored links… Time to do some cleaning up and reorganizing.

    Better Gmail in Action

    As soon as you have installed Better Gmail, a new tab will automatically open and present you with the available options. Place a checkmark in the box for each option that you would like activated and click on "Save" when finished. Note: the final option entry is a tie-in with two other "linked" extensions (Folders4Gmail & HTML Signature), while the middle listing is a link to an article for disabling Google Buzz. Once you have saved your changes in the "Options" you will be prompted to refresh your Gmail tab to see the changes.

    Going back to our "Inbox Area", everything looks so much more streamlined and clean now. Goodbye clutter! The "New Window, Print all, & Create a document" commands definitely look a lot nicer as a small toolbar above our e-mail. And the right side…you can see for yourself just how much better that looks. No more distractions there to bother you as you read your e-mail.

    Conclusion

    If you have been wanting to get rid of the undesirable elements visible in your Gmail account, then hurry over to the Better Gmail page, grab the extension and enjoy the better view.

    Links

    Download the Better Gmail extension (Google Chrome Extensions)

    Read the article

  • Failing with Adsense / How to get $ PC

    - by cam77
    So, I am literally just starting out with Google AdSense. I have implemented AdSense ads from one account and two different channels, one per website they're on (2 WordPress websites, activated with the 'GoogleAdsense Plugin' for WordPress). They are implemented at the bottom of every post within the 'Blog' sections of my 2 WordPress websites. My AdSense dashboard states that I've received a few clicks, but it still shows my account earnings and balance as $0.00. When and how will I start seeing money earned in my account?

    Read the article

  • How to track Google Analytics of an Adobe AIR app?

    - by dreagan
    I have written an Adobe AIR desktop application that tracks a bunch of websites and displays images from those websites in the app. And instead of keeping my mouth shut about it and making it look like an attack on the websites, I'd like to make it so that the webmasters can see that these pageviews are made by my application. Is there any way the webmaster could distinguish Adobe AIR access to the website from normal visitor browsing? Perhaps by adding something to the URLRequests I make in the application..?

    Read the article

  • Google I/O 2012 - Making Google Product Search Work for You Using the Content API for Shopping

    Mayuresh Saoji, Danny Hermes

    To get the best out of product search, merchants need to provide complete and accurate product information, as well as fresh price and availability data, for all products. This session will provide merchants with concrete steps they can take to improve their data quality using the Content API for Shopping. We will provide details on when it makes sense to use the Content API to submit data (as opposed to Feeds), and how to use the API. We will also go into detail on how to debug API requests and errors, and talk about general best practices to follow in order to use the API optimally and efficiently. For all I/O 2012 sessions, go to developers.google.com.

    Read the article

  • Panda 4: Reducing #indexed pages. How much is enough?

    - by Noam
    I've been hit by Panda 4 (a 40% decrease). I didn't see any change during Panda 1-3. From what I've read, and when compared to my site, the change is probably due to the fact that I have over 30M pages indexed on Google, and they've started seeing that as some sort of bad indication. Although I feel all of the pages have a unique value that Google should crawl, it seems I should make some tough calls and reduce the indexed pages according to some prioritization I will conduct. The question is what my target should be, or what factors should help me figure out a relevant target. How many pages should I try to reduce to: 25M, 15M, 1M, 2000? Is it enough to add noindex to low-priority pages, or should I also remove all internal linking to them?

    Read the article

  • Getting the keyword as a parameter from Adwords using ValueTrack

    - by Stephen Ostermiller
    I set up an AdWords campaign for a website following the instructions for Google AdWords ValueTrack. One of the things it is supposed to be able to do is pass the keyword as a URL parameter, using the code {keyword} in the URL. I set it up for integration with Google Analytics such that the landing URLs would look like: http://example.com/landing.html?utm_source=adwords&utm_medium=cpc&utm_term=%7Bkeyword%7D&utm_content=my_content&utm_campaign=my_page where {keyword} is in the utm_term parameter. However, this keyword substitution isn't happening. Why?

    Read the article

  • How do I change the Google Chrome offline icon?

    - by user1105047
    I am using some offline apps for Google Chrome, and because of this an application indicator pops up in the GNOME panel. My problem is that this indicator uses the default Google Chrome system tray icon, and I think it looks ugly with my current theme, so I would basically like to change the icon. I can't find the icons that Google Chrome is using for this purpose, but I have no problem finding the other icons used by Google Chrome and changing them, like the 24x24 or 22x22 icons that you see in the GNOME application menu (which by default look like the application indicator for Google Chrome). It doesn't seem like Google Chrome is taking the icon from e.g. /usr/share/icons/, and I can't change the icon by changing the icon theme in, say, gconf-editor. Is there any way to see the preferences (like you can do with launchers) of the application indicators in the gnome-panel and then locate the indicator icon, or change it in another way?

    Read the article

  • SEO issue - External links rel="nofollow" or NOT!

    - by Mary Melody
    Previously I created a website for promo codes and coupons. It has hundreds of external links to the retailers' websites, and I used the rel="nofollow" tag. But my site's SEO ranking was very bad, especially on Google. So I then removed the rel="nofollow" tag, but saw no improvement. The only difference between this site and my other sites is the external links; my other sites have good rankings on Google. Now I'm creating a site for reviews, so this is a similar situation for me. I just want to know how SEO reacts to external links, and what possibly happened in my case?

    Read the article

  • What to do with a site that has multiple languages in Google Analytics...

    - by stephmoreland
    We have a site that has four "streams" for language, and each language has different content based on that language and location (US English, Spanish, Canadian English and Canadian French). I'm wondering if I have to set up accounts for each stream so that we can see the stats from each stream only, or can I use one account and somehow tell GA to separate the different streams based on language? For example, the US English site starts at (/en/) while the Canadian English site starts at (/ca_en/), etc.

    Read the article

  • 404 code/header for search engines, on removed user content?

    - by mowgli
    I just got an email from a former user of my website. He was complaining that Google still shows the contact page he created on my site, even though he deleted it a month ago. This is the first time in many years anyone has requested this. I told him that it's almost entirely up to Google what content it wants to keep/show and for how long. If it's deleted on the site, I can't do much other than request a re-visit from the Googlebot. The user page now already says something like "Not found. The user has removed the content". TL;DR: But the question is: should I generally send a 404 header (or other status) for dynamic user content that has been removed from the site? Or could this hurt the site (SEO)?

    Read the article

  • Google Analytics - Profile filter with more than one dimension?

    - by Drewdavid
    I have identified some bot traffic in GA, but to filter it accurately I need to filter it by two dimensions, namely Browser and ISP. To be clear, I don't want to apply two filters that block the entire ISP and the entire Browser segments, but only the combination of the two. Could someone explain how to do this? It's not apparent in the interface, and I'm not able to find any documentation about it.

    Read the article

  • Google Analytics custom variables and how are they recorded?

    - by mrtsherman
    I have been asked to add GA custom variable tracking to my company's website. The company website uses server-side includes, so modifications to the tracking code happen identically everywhere; maintenance is therefore a headache. Also, GA takes about twenty-four hours for custom variables to start showing up in reports, and that makes troubleshooting a headache too. So say we have custom variables like: /* visitor level tracking, id = 12345 */ _gaq.push(['_setCustomVar', 1, 'id', '12345', 1]); /* page level tracking, email = [email protected] */ _gaq.push(['_setCustomVar', 1, 'email', '[email protected]', 1]); The marketing people want the following out of this: a user visits the site and we record a unique id for them; whenever they return, this id is used in GA. A user signs up for our newsletter on page X and we record their email address; whenever they return, this email address is used in GA. Now a big problem for me is that I don't use GA and the marketing people don't use custom variables, so we don't actually know how this will work. Do I want page, session or visitor level tracking? What happens given that the same GA code is used on every page? If they visit the email sign-up form and we record the email address, but then they go somewhere else where the email is nonexistent, will the value get 'overwritten'? Sorry for the long question, but there are a lot of unknowns for a GA noob.

    Read the article

  • Where to get ads for my website?

    - by Divyanshu Negi
    I am the developer of the website named viewloud. Now that my website is getting around 100 visitors per day, I was thinking that I should put some ads on it, but it is really very hard to find the advertising plan that would benefit me the most. Google AdSense: is it a good choice? Will Google AdSense allow me to open an account there with such little traffic on my website? Please help; over the last two months I have worked very hard to bring this traffic :p I know it is very little, but I am still working on it, so please help me out, guys. Thank you, Divyanshu

    Read the article

  • Getting web results URLs in millions [closed]

    - by tereško
    I looked at all the Stack Exchange sites and couldn't find one suitable for this question, but I'm posting here as the nearest match to the scenario. After a month of research I have basically given up on getting all the URLs from a set of search results programmatically. I looked at the Google Search API for a way to get millions of search result URLs, specifically into a text file or something similar, but had no success; still, I am sure there must be a way or trick of doing it. Real question: is there any way, programmatically or manually, that I can get 1000+ search results for a query? E.g. "Apple" returns millions of results on Google, and I want as many of those result URLs as possible in a text file.
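
    For reference, Google's hosted Custom Search JSON API does return result URLs programmatically, but only ten per request and only up to roughly the first hundred results per query, so it cannot produce the millions of URLs asked about here. A minimal sketch of paging through what it does allow (the API key and search-engine ID are placeholders, and the "link" extraction is deliberately naive; a real client would use a JSON parser):

      import java.io.BufferedReader;
      import java.io.InputStreamReader;
      import java.net.HttpURLConnection;
      import java.net.URL;
      import java.net.URLEncoder;
      import java.util.regex.Matcher;
      import java.util.regex.Pattern;

      public class SearchUrlDump {
          public static void main(String[] args) throws Exception {
              String apiKey = "YOUR_API_KEY";       // placeholder
              String engineId = "YOUR_ENGINE_ID";   // placeholder
              String query = URLEncoder.encode("Apple", "UTF-8");

              // The API serves results in pages of 10; start offsets beyond ~91 are rejected.
              for (int start = 1; start <= 91; start += 10) {
                  URL url = new URL("https://www.googleapis.com/customsearch/v1?key=" + apiKey
                          + "&cx=" + engineId + "&q=" + query + "&start=" + start);
                  HttpURLConnection conn = (HttpURLConnection) url.openConnection();
                  StringBuilder body = new StringBuilder();
                  try (BufferedReader in = new BufferedReader(
                          new InputStreamReader(conn.getInputStream(), "UTF-8"))) {
                      String line;
                      while ((line = in.readLine()) != null) {
                          body.append(line);
                      }
                  }
                  // Pull out the "link" fields and print one URL per line.
                  Matcher m = Pattern.compile("\"link\":\\s*\"([^\"]+)\"").matcher(body);
                  while (m.find()) {
                      System.out.println(m.group(1));
                  }
              }
          }
      }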

    Read the article
