Search Results

Search found 1669 results on 67 pages for 'predictive analytics'.

Page 31/67

  • Facebook Like button data for a domain?

    - by mungojerie
    We make a widget for media sites and have included the Facebook Like button, so anyone who installs our widget gets the Like button on every page without having to do additional integration work. We'd like to show the site owner some basic data / analytics about Like activity... which URLs are liked, and how many times. I can't figure out if this is even available via the Graph API or FQL. If it is, I can't tell if I need an app_id and secret token for each domain or not. Any suggestions?
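
    For what it's worth, a rough sketch of one way to pull per-URL Like counts, assuming the (since-deprecated) FQL link_stat table is still available and that public URL stats can be read without an app_id or secret; the endpoint and field names are assumptions to verify against the current Graph API documentation:

        // Hedged sketch: query the (deprecated) FQL link_stat table for per-URL Like counts.
        // Endpoint and field names are assumptions to double-check against Facebook's docs.
        var fql = "SELECT url, like_count, share_count, total_count FROM link_stat " +
                  "WHERE url = 'http://example.com/some-article'";

        $.getJSON("https://graph.facebook.com/fql?q=" + encodeURIComponent(fql), function (data) {
            // data.data is expected to be an array of rows, one per URL queried.
            console.log(data);
        });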

    Read the article

  • Architecture for analysing search result impressions/clicks to improve future searches

    - by Hais
    We have a large database of items (10m+) stored in MySQL and intend to implement search on metadata on these items, taking advantage of something like Sphinx. The dataset will be changing slightly on a daily basis so Sphinx will be re-indexing daily. However we want the algorithm to self-learn and improve search results by analysing impression and click data so that we provide better results for our customers on that search term, and possibly other similar search terms too. I've been reading up on Hadoop and it seems like it has the potential to crunch all this data, although I'm still unsure how to approach it. Amazon has tutorials for compiling impression vs click data using MapReduce but I can't see how to get this data in a useable format. My idea is that when a search term comes in I query Sphinx to get all the matching items from the dataset, then query the analytics (compiled on an hourly basis or similar) so that we know the most popular items for that search term, then cache the final results using something like Memcached, Membase or similar. Am I along the right lines here?
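
    That general shape sounds workable; a rough sketch of the read path is below, where searchSphinx, getClickStats and cache are hypothetical helpers standing in for the Sphinx client, the hourly impression/click rollup and Memcached/Membase respectively, and the blend weights are arbitrary placeholders to be tuned:

        // Hypothetical helpers: searchSphinx(), getClickStats() and cache stand in for the
        // Sphinx client, the hourly click/impression rollup, and Memcached/Membase.
        async function search(term) {
            const cached = await cache.get("search:" + term);
            if (cached) return cached;

            const hits  = await searchSphinx(term);    // e.g. [{id, textScore}, ...]
            const stats = await getClickStats(term);   // e.g. {itemId: clickThroughRate, ...}

            // Blend text relevance with observed click-through rate; 0.7/0.3 are arbitrary
            // weights that would be tuned against real impression vs. click data.
            const ranked = hits
                .map(h => ({ ...h, score: 0.7 * h.textScore + 0.3 * (stats[h.id] || 0) }))
                .sort((a, b) => b.score - a.score);

            await cache.set("search:" + term, ranked, { ttl: 3600 }); // reuse until the next rollup
            return ranked;
        }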

    Read the article

  • Regular expression for a string containing one word but not another

    - by Chris Stahl
    I'm setting up some goals in Google Analytics and could use a little regex help. Let's say I have 4 URLs:
        http://www.anydotcom.com/test/search.cfm?metric=blah&selector=size&value=1
        http://www.anydotcom.com/test/search.cfm?metric=blah2&selector=style&value=1
        http://www.anydotcom.com/test/search.cfm?metric=blah3&selector=size&value=1
        http://www.anydotcom.com/test/details.cfm?metric=blah&selector=size&value=1
    I want to create an expression that will identify any URL that contains the string selector=size but does NOT contain details.cfm. I know that to find a string that does NOT contain another string I can use this expression: (^((?!details.cfm).)*$) But I'm not sure how to add in the selector=size portion. Any help would be greatly appreciated!
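
    One way to combine the two conditions is a negative lookahead followed by the required substring. The pattern below is valid JavaScript regex and can be sanity-checked against the four URLs; whether the Analytics goal field accepts lookaheads should be verified in the goal preview before relying on it:

        // Minimal check of the combined pattern: must contain "selector=size",
        // must not contain "details.cfm".
        var goalPattern = /^(?!.*details\.cfm).*selector=size/;

        var urls = [
            "http://www.anydotcom.com/test/search.cfm?metric=blah&selector=size&value=1",
            "http://www.anydotcom.com/test/search.cfm?metric=blah2&selector=style&value=1",
            "http://www.anydotcom.com/test/search.cfm?metric=blah3&selector=size&value=1",
            "http://www.anydotcom.com/test/details.cfm?metric=blah&selector=size&value=1"
        ];

        urls.forEach(function (url) {
            console.log(goalPattern.test(url), url); // true, false, true, false
        });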

    Read the article

  • Register now for a complimentary Oracle Health Sciences 3-day workshop on Enterprise Healthcare Analytics training in Dallas, US, Nov 12-14, 2013!

    - by Roxana Babiciu
    Join Oracle Health Sciences for an informative overview of Oracle Enterprise Healthcare Analytics (EHA) for Sales / Business Development and Implementation team members. You'll gain an understanding of the Oracle EHA product strategy, get a platform overview, and hear customer success stories that will help you in the field. Be ready for technical education and training spanning three days of deep expertise sharing.

    Read the article

  • PHP DOM vs SimpleXML for Atom GData feed parsing

    - by Geoff Adams
    I'm building a library to access the Google Analytics Data Export API. All the data the library accesses is in Atom format and utilises numerous different namespaces throughout. My experiments with the API have used SimpleXML for parsing so far, especially as all I have been doing is accessing the data held within the feed. Now that I'm coming to write the library, I am wondering whether forging ahead with SimpleXML will be adequate, or whether the enhanced functionality of the DOM module in PHP would be of benefit in the future. I haven't written much code for this part of the library yet, so the choice is still open. I have read that the PHP DOM module can be a better choice if you need to build an XML DOM on the fly or modify an existing one, but I'm not entirely sure I would need that functionality anyway due to the nature of the API (no pushing data to the server, for instance). SimpleXML is certainly easier to use, and I have seen people saying that for read-only situations it is all you need. Essentially the question is, what would you use? Compatibility will not be an issue as the server configuration will match the application's requirements. Is it worth building the library with PHP DOM in mind, or should I stick with SimpleXML for now? Update: Here are two examples of the kind of feeds I will be dealing with: the Account feed and the Data feed.

    Read the article

  • Collecting high-volume video viewing data

    - by DanK
    I want to add tracking to our Flash-based media player so that we can provide analytics that show what sections of videos are being watched (at the moment, we just register a view when a video starts playing). For example, if a viewer watches the first 30 seconds of a video and then clicks away to something else, we want the data to reflect that. Likewise, if someone watches the first 10 seconds, then scrubs the timeline to the last minute of the video and watches that, we want to register viewing on the parts watched and not the middle section. My first thought was to collect up the viewing data in the player and send it all to the server at the end of a viewing session. Unfortunately, Flash does not seem to have an event that you can hook into when a viewer clicks away from the page the movie is on (probably a good thing - it would be open to abuse). So, it looks like we're going to have to make regular requests to the server as the video is playing. This is obviously going to lead to a high volume of requests when there are large numbers of simultaneous viewers. The simple approach of dumping all these 'heartbeat' events from clients to a database feels like it will quickly become unmanageable, so I'm wondering whether I should be taking an approach where viewing sessions are cached in memory and flushed to the database when they become inactive (based on a timeout). That way, the data could be stored as time spans rather than individual heartbeats. So, to the question - what is the best way to approach dealing with this kind of high-volume viewing data? Are there any good existing architectures/patterns? Thanks, Dan.
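
    A minimal sketch of the "cache in memory, flush spans on inactivity" idea, assuming the player sends a heartbeat of (sessionId, positionSeconds) every few seconds; the function names, the 60-second idle cutoff and saveSpansToDatabase are all illustrative rather than a definitive design:

        // Illustrative only: collapse per-session heartbeats into watched spans and
        // flush a session to storage after 60s of inactivity.
        const sessions = new Map(); // sessionId -> { spans: [[start, end], ...], lastSeen }

        function onHeartbeat(sessionId, positionSeconds) {
            const now = Date.now();
            let s = sessions.get(sessionId);
            if (!s) {
                sessions.set(sessionId, { spans: [[positionSeconds, positionSeconds]], lastSeen: now });
                return;
            }
            const last = s.spans[s.spans.length - 1];
            // With heartbeats every ~5s: if the new position is just past the end of the
            // current span, extend it; otherwise the viewer scrubbed, so start a new span.
            if (positionSeconds >= last[0] && positionSeconds - last[1] <= 10) {
                last[1] = positionSeconds;
            } else {
                s.spans.push([positionSeconds, positionSeconds]);
            }
            s.lastSeen = now;
        }

        // Periodically persist idle sessions as time spans rather than raw heartbeats.
        setInterval(() => {
            const cutoff = Date.now() - 60000;
            for (const [id, s] of sessions) {
                if (s.lastSeen < cutoff) {
                    saveSpansToDatabase(id, s.spans); // hypothetical persistence call
                    sessions.delete(id);
                }
            }
        }, 15000);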

    Read the article

  • How do I use a jQuery not selector to select relative URLs?

    - by Matt
    I'm working on a little jQuery script to add Google Analytics pageTracker onclick data to all relative URLs on my forum, allowing me to track clicks to external sites. I don't want to add the onclick to internal links on forum.sitename or sitename, and I don't want to add them to any hrefs marked # or that start with /. My script below works nicely, but for one minor problem! All of the forum's URLs are relative and don't start with /. I appear to have no way to change that, so I need to modify the jQuery below to prevent it adding the onclick to relative links, as it currently does. What I want to do is write a .not() function like .not("[href!^=http") to prevent jQuery from adding the onclick to any hrefs which do not start with http. However, .not() appears not to support this. I'm new to jQuery and can't figure this out. Any pointers would be massively appreciated.
        $(document).ready(function(){
          // Get URL from a href
          var URL = $("a").attr('href');
          // Add pageTracker data for GA tracking
          $("a")
            .not("[href^=#]")
            .not("[href^=http://forum.sitename]")
            .not("[href^=http://www.sitename]")
            .attr("onclick","pageTracker._trackEvent('Outgoing_Links', 'Forum', " + URL + ");");
        });
    Thanks!
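
    For what it's worth, a hedged sketch of one way around the .not() limitation: select only absolute links up front with the ^= attribute selector, then exclude the internal domains. It also reads each link's own href rather than the first anchor's, and quotes the URL inside the onclick string; both are assumptions about what the original script intended:

        $(document).ready(function () {
            // Keep only anchors whose href starts with "http", then drop internal domains.
            $('a[href^="http"]')
                .not('[href^="http://forum.sitename"]')
                .not('[href^="http://www.sitename"]')
                .each(function () {
                    var url = $(this).attr('href'); // this link's own href, not the page's first one
                    $(this).attr('onclick',
                        "pageTracker._trackEvent('Outgoing_Links', 'Forum', '" + url + "');");
                });
        });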

    Read the article

  • Good analytics tools that can track visitor actions from a particular source?

    - by tnorthcutt
    Are there good tools that can track what actions a certain subset of visitors (i.e. from a particular source) do once they're on your site? As far as I know (which could be wrong), Google Analytics can't do this beyond telling you how long they stayed, bounce rate, and average number of pages. I'm looking for something that can tell me which links they clicked on, and if possible break it down per-visitor. Free solutions would be great, but I'm anticipating that this would require a paid solution.

    Read the article

  • Essbase Analytics Link (EAL) - Performance of some EAL operations can be improved by tuning EAL Data Synchronization Server (DSS) parameters

    - by Ahmed Awan
    Generally, the performance of some EAL (Essbase Analytics Link) operations can be improved by tuning the EAL Data Synchronization Server (DSS) parameters.
    a. It is expected that the DSS machine is a 64-bit machine with 4-8 cores and 5-8 GB of RAM dedicated to DSS.
    b. To change the DSS configuration, open the EAL Configuration Tool on the DSS machine, click Next, and define:
        "Job Units" as <number of cores dedicated to DSS> * 1.5.
        "Max Memory Size" (if this is a 64-bit machine) as roughly 1 GB for each job unit; for example, a 4-core DSS machine would get 6 job units and about 6 GB. If the DSS machine is 32-bit, the maximum memory size is 2600 MB.
        "Data Store Size" depends on the number of bridges and the volume of the HFM applications, but in most cases 50000 MB is enough. This volume should be available on the drive defined as "Data Store Dir".
    Continue with the configuration and finish it. After that, DSS should be restarted to pick up the new definitions.

    Read the article

  • How are certain analytics metrics (time on site, etc.) usually distributed?

    - by a barking spider
    I'm not sure if I've come to the right place to ask this question, but I'm gathering some information for a research project. We're trying to design an experiment that'll heavily involve web analytics, and I'm trying to figure out some sensible values of mean +/- standard deviation for the following visitor-level metrics (i.e., visitor 1 spends 2 minutes on site, visitor 2 spends 1 minute -- mean 1.5 +/- 0.71...): time spent on site, and page views. If time allowed, we would put up the sites and gather the information ourselves, but we have a grant deadline coming up. I realize that even though the distributions of these quantities are probably going to be heavily skewed towards zero, we'll need some reasonable figures or estimates of these figures in order to do sample size calculations, etc. Anyway, I'm not sure where else I'd turn, and I certainly have had a difficult time finding these values in the prior literature. If someone could direct me to a paper with the right information, or if you have these figures on hand (perhaps taken directly from your logs!) -- that would be amazing, and I'd love to hear from you. Thanks in advance, and even though I'm not allowed to reveal too much, rest assured that this info'll be applied towards a good cause :)

    Read the article

  • Using a dynamic <script> (a <script> appended to the DOM by JavaScript) to load and initialize Clicky

    - by Bungle
    I'm creating an HTML/JavaScript widget to be used on third-party sites. This widget is generated by a <script> that our customers will insert on their page. The <script> creates an <iframe> in the customer's domain, and then creates and inserts all of that <iframe>'s content using JavaScript. It's important that this <iframe> contain Clicky's tracking code to monitor clicks on outbound links. Unfortunately, I'm not having any luck getting Clicky to work when I append the requisite <script> elements to the <iframe> using JavaScript. I first tried simply appending the Clicky tracking code to the <iframe> after appending some test outbound links, hoping that Clicky could attach to those automatically as it does on a static page. That didn't seem to work, so my next inclination was to use the "advanced_disable" custom option and use clicky.log() on the links I want to track. Here's a link to a test page that's along those lines: http://onespot.wsj.com/static/clicky_iframe_test.html When clicking a link on that test page, the action is not logged in Clicky, and a JavaScript error appears: clicky is not defined This ("clicky") appears to be defined in http://static.getclicky.com/js, which I confirmed through the Firebug console is indeed loading before I click a test outbound link. Has anyone successfully loaded Clicky in this way? If so, could you provide some sample code, a link to a working implementation, or some feedback on what's wrong with my code? I would also be interested to know if this is even possible. Thanks very much for any help or advice!
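
    One thing that commonly explains a "clicky is not defined" error in this setup is calling clicky.log() before the dynamically appended script has actually finished loading. A hedged sketch is below; the clicky_site_ids bootstrap and the clicky.log(href, title, type) signature are taken from Clicky's published snippet as I recall it and should be verified, the site ID is hypothetical, and older IE would need onreadystatechange instead of onload:

        // Hedged sketch: ensure the Clicky script has loaded before calling clicky.log().
        window.clicky_site_ids = window.clicky_site_ids || [];
        window.clicky_site_ids.push(12345); // hypothetical site ID

        var s = document.createElement('script');
        s.async = true;
        s.src = '//static.getclicky.com/js';
        s.onload = function () {
            // Only bind outbound-link logging once "clicky" exists.
            var links = document.getElementsByTagName('a');
            for (var i = 0; i < links.length; i++) {
                links[i].onclick = (function (href) {
                    return function () { clicky.log(href, document.title, 'outbound'); };
                })(links[i].href);
            }
        };
        document.getElementsByTagName('head')[0].appendChild(s);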

    Read the article

  • What does "?ref=ts" mean in a FACEBOOK APP url

    - by jozecuervo
    When Facebook drives traffic to an app, they often append &ref=whatever to the query string. This is useful for figuring out which integration points are working or not. I've figured out what some of these mean. For example: ref=bookmarks - the user clicked on a bookmark. ref=game_my_recent - the user clicked on the upper portion of the games dashboard. Does anyone know what "ref=ts" means? It accounts for a ton of traffic. I've viewed the source of common FB pages all over the site and cannot find a match for any piece of content generated by any of my apps. Same question, posted by me on the FB DEV FORUM: http://forum.developers.facebook.com/viewtopic.php?id=54866 I'm thinking I'll get better answers here ;) Thanks people, Jose

    Read the article

  • rank on two dates - each date iteratively

    - by Abhi
    How do I query for a rank over 'value' for each day in the table below? For example, it should list out 'mydate', 'value' and 'rank' for all values on the 20th, and then do a fresh rank() for all values on the 21st. Thanks...
        create table tv (mydate,value) as
        select to_date('20/03/2010 00','dd/mm/yyyy HH24'),98 from dual union all
        select to_date('20/03/2010 01','dd/mm/yyyy HH24'),124 from dual union all
        select to_date('20/03/2010 02','dd/mm/yyyy HH24'),140 from dual union all
        select to_date('20/03/2010 03','dd/mm/yyyy HH24'),138 from dual union all
        select to_date('20/03/2010 04','dd/mm/yyyy HH24'),416 from dual union all
        select to_date('20/03/2010 05','dd/mm/yyyy HH24'),196 from dual union all
        select to_date('20/03/2010 06','dd/mm/yyyy HH24'),246 from dual union all
        select to_date('20/03/2010 07','dd/mm/yyyy HH24'),176 from dual union all
        select to_date('20/03/2010 08','dd/mm/yyyy HH24'),124 from dual union all
        select to_date('20/03/2010 09','dd/mm/yyyy HH24'),128 from dual union all
        select to_date('20/03/2010 10','dd/mm/yyyy HH24'),32010 from dual union all
        select to_date('20/03/2010 11','dd/mm/yyyy HH24'),384 from dual union all
        select to_date('20/03/2010 12','dd/mm/yyyy HH24'),368 from dual union all
        select to_date('20/03/2010 13','dd/mm/yyyy HH24'),392 from dual union all
        select to_date('20/03/2010 14','dd/mm/yyyy HH24'),374 from dual union all
        select to_date('20/03/2010 15','dd/mm/yyyy HH24'),350 from dual union all
        select to_date('20/03/2010 16','dd/mm/yyyy HH24'),248 from dual union all
        select to_date('20/03/2010 17','dd/mm/yyyy HH24'),396 from dual union all
        select to_date('20/03/2010 18','dd/mm/yyyy HH24'),388 from dual union all
        select to_date('20/03/2010 19','dd/mm/yyyy HH24'),360 from dual union all
        select to_date('20/03/2010 20','dd/mm/yyyy HH24'),194 from dual union all
        select to_date('20/03/2010 21','dd/mm/yyyy HH24'),234 from dual union all
        select to_date('20/03/2010 22','dd/mm/yyyy HH24'),328 from dual union all
        select to_date('20/03/2010 23','dd/mm/yyyy HH24'),216 from dual union all
        select to_date('21/03/10 00','dd/mm/yyyy HH24'),224 from dual union all
        select to_date('21/03/10 01','dd/mm/yyyy HH24'),292 from dual union all
        select to_date('21/03/10 02','dd/mm/yyyy HH24'),264 from dual union all
        select to_date('21/03/10 03','dd/mm/yyyy HH24'),132 from dual union all
        select to_date('21/03/10 04','dd/mm/yyyy HH24'),142 from dual union all
        select to_date('21/03/10 05','dd/mm/yyyy HH24'),328 from dual union all
        select to_date('21/03/10 06','dd/mm/yyyy HH24'),184 from dual union all
        select to_date('21/03/10 07','dd/mm/yyyy HH24'),240 from dual union all
        select to_date('21/03/10 08','dd/mm/yyyy HH24'),224 from dual union all
        select to_date('21/03/10 09','dd/mm/yyyy HH24'),496 from dual union all
        select to_date('21/03/10 10','dd/mm/yyyy HH24'),370 from dual union all
        select to_date('21/03/10 11','dd/mm/yyyy HH24'),352 from dual union all
        select to_date('21/03/10 12','dd/mm/yyyy HH24'),438 from dual union all
        select to_date('21/03/10 13','dd/mm/yyyy HH24'),446 from dual union all
        select to_date('21/03/10 14','dd/mm/yyyy HH24'),426 from dual union all
        select to_date('21/03/10 15','dd/mm/yyyy HH24'),546 from dual union all
        select to_date('21/03/10 16','dd/mm/yyyy HH24'),546 from dual union all
        select to_date('21/03/10 17','dd/mm/yyyy HH24'),684 from dual union all
        select to_date('21/03/10 18','dd/mm/yyyy HH24'),568 from dual union all
        select to_date('21/03/10 19','dd/mm/yyyy HH24'),504 from dual union all
        select to_date('21/03/10 20','dd/mm/yyyy HH24'),392 from dual union all
        select to_date('21/03/10 21','dd/mm/yyyy HH24'),256 from dual union all
        select to_date('21/03/10 22','dd/mm/yyyy HH24'),236 from dual union all
        select to_date('21/03/10 23','dd/mm/yyyy HH24'),168 from dual
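
    For what it's worth, the usual Oracle answer to a per-day rank is an analytic function partitioned by the day, e.g. RANK() OVER (PARTITION BY TRUNC(mydate) ORDER BY value DESC). The snippet below sketches the same "fresh rank per day" grouping in JavaScript purely as an illustration of the logic; the descending order (highest value gets rank 1) is an assumption about which end of the scale matters:

        // rows: [{mydate: Date, value: Number}, ...]
        // Groups by calendar day, then assigns RANK()-style ranks within each day.
        function rankPerDay(rows) {
            var byDay = {};
            rows.forEach(function (r) {
                var day = r.mydate.toISOString().slice(0, 10); // group key: the calendar day
                (byDay[day] = byDay[day] || []).push(r);
            });
            var out = [];
            Object.keys(byDay).sort().forEach(function (day) {
                var sorted = byDay[day].slice().sort(function (a, b) { return b.value - a.value; });
                sorted.forEach(function (r, i) {
                    // RANK() semantics: ties share a rank and the following rank is skipped.
                    var rank = (i > 0 && r.value === sorted[i - 1].value)
                        ? out[out.length - 1].rank
                        : i + 1;
                    out.push({ mydate: r.mydate, value: r.value, rank: rank });
                });
            });
            return out;
        }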

    Read the article

  • Google Ajax search API

    - by jAndy
    Hi folks, I'm wondering whether it's possible to retrieve Google results over their own AJAX API at something like 100 results per page. Without a visible search field, I'd like to fetch the results in the background to build a progression for some search phrases. My basic question is: what are the restrictions of the Google Search API? Kind regards --Andy
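
    For reference, the classic AJAX Search API was documented (at the time) as returning at most 8 results per request (rsz=large) and roughly 64 results in total per query via the start parameter, so 100 results per page is not possible through it. A hedged background-fetch sketch is below; the endpoint, parameters and the ~64-result cap are assumptions to re-check against current docs, since the API has since been deprecated:

        // Hedged sketch of paging the (deprecated) Google AJAX Search API in the background.
        function fetchAllResults(query, callback) {
            var results = [];
            function page(start) {
                $.getJSON("https://ajax.googleapis.com/ajax/services/search/web?v=1.0&callback=?",
                    { q: query, rsz: "large", start: start },
                    function (data) {
                        var batch = (data.responseData && data.responseData.results) || [];
                        results = results.concat(batch);
                        if (batch.length === 8 && start + 8 < 64) {
                            page(start + 8); // next page of 8, up to the documented cap
                        } else {
                            callback(results);
                        }
                    });
            }
            page(0);
        }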

    Read the article

  • how to developing "document plagiarism checker" website in asp.net?

    - by user1637402
    I know a website, write-check, whose functionality is this: a user uploads a file (PDF, DOC) and it checks the percentage of redundancy between the uploaded file and a large set of websites, books and research papers. After the user uploads the file, the result shows the redundancy percentage and highlights the copied paragraphs, i.e. the paragraphs that were repeated in the referenced websites, and when the user hovers over these highlights the source or reference appears, so the user can see where the text was copied from. That is a simple explanation of the website's functionality. Can anyone help me with the analysis for an ASP.NET website that has the same functionality, and with how to check an uploaded file against archived files?
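
    As a starting point for the "check the uploaded file against archived files" part, one common approach is word n-gram shingling plus Jaccard similarity; it is sketched here in JavaScript purely for illustration (the same idea ports directly to C#/ASP.NET), and all names and the 5-word shingle size are assumptions:

        // Illustrative only: score two documents by the overlap of their 5-word shingles.
        function shingles(text, n) {
            var words = text.toLowerCase().replace(/[^a-z0-9\s]/g, " ").split(/\s+/).filter(Boolean);
            var set = new Set();
            for (var i = 0; i + n <= words.length; i++) {
                set.add(words.slice(i, i + n).join(" "));
            }
            return set;
        }

        function similarity(docA, docB) {
            var a = shingles(docA, 5), b = shingles(docB, 5);
            var common = 0;
            a.forEach(function (s) { if (b.has(s)) common++; });
            var union = a.size + b.size - common;
            return union === 0 ? 0 : common / union; // Jaccard: 0 = no overlap, 1 = identical
        }

    A "redundancy percentage" against an archive would then be the best (or aggregated) similarity over all archived documents, and the matching shingles can be mapped back to paragraphs for highlighting.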

    Read the article

  • From an SEO point of view, is it better to use Domain-Dash.com or Domainwithoutdash.com?

    - by Msc. Adrian Lopez
    I have been reading forums and so on, but haven't found a clear or conclusive answer about the strategic decision between domain-with-dash.com and notusingdashes.com. Is there a problem or disadvantage in ranking for those keywords? Is it better to have the-domain-with-dash.com than shortdomain.net? In many cases you don't have the .com available for that specific keyword. What are your opinions? Please back them with facts or add links to the source. What does Google have to say?

    Read the article

  • First Party and Third Party Cookie

    - by ajithperuva
    I want to create an analytics project (just like Google Analytics) for measuring conversion rate and tracking visitor count. How can we create a first-party cookie and a third-party cookie using PHP? And how can we tell our first-party and third-party cookies apart? Do we need to follow any kind of standard to identify them? If anybody knows, please give me some idea about it.
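
    Conceptually, the distinction is about which domain sets the cookie: a first-party cookie is set on the publisher's own domain (for example by a JavaScript snippet the site embeds, or by the site's own PHP via setcookie()), while a third-party cookie arrives via a Set-Cookie header on a request to the analytics vendor's domain (for example a tracking pixel). A minimal sketch of the first-party case, shown in JavaScript with illustrative names only:

        // Illustrative only: a tracking snippet setting a first-party visitor cookie.
        // Because it runs on the publisher's page, the cookie belongs to the site's own domain;
        // a third-party cookie would instead be set by a response from the vendor's domain.
        (function () {
            var name = "_myvisitor";
            if (document.cookie.indexOf(name + "=") === -1) {
                var id = Date.now().toString(36) + Math.random().toString(36).slice(2);
                var expires = new Date(Date.now() + 2 * 365 * 24 * 3600 * 1000).toUTCString();
                document.cookie = name + "=" + id + "; expires=" + expires + "; path=/";
            }
        })();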

    Read the article

  • Custom iPhone analytics tool

    - by Ondrej
    Do you think that if I build my own custom analytics tool (like Flurry or Pinch Media) and host it on the same server as my application's data source, Apple will consider it a third-party analytics tool or not? The problem is that Flurry and Pinch are being banned from the App Store by the newest T&Cs, so I thought I would build an open-source library that allows anyone to have their own analytics installed on their own server. Thanks, Ondrej

    Read the article

  • Track each request to the website using HttpModule

    - by stacker
    I want to save each request to the website. In general I want to include the following information:
        1. User IP, the website URL, the user (if one exists), and the date/time.
        2. Response time and response success/failure status.
    Is it reasonable to collect 1 and 2 in the same action (like the same HttpModule)? Do you know of any existing structure I can follow that tracks every request and response status for the website? The data needs to be logged to SQL Server.

    Read the article

  • SVN Attribution Plugin?

    - by Rosarch
    I'm using SVN with Google Code Project Hosting for a school project. As the codebase increases in size, I often find myself wondering questions like: "who originally checked in this line of code?" "who has been checking in the most code recently?" "Of the final product, how much of it was written by Person X?" "Which coder is best at adhering to the coding conventions?" Is there any plugin available to do this? (If not, I would be interested in developing one myself. Any ideas on where to get started on that?) We're using Visual Studio 2008 with the AnkhSVN plugin.

    Read the article

  • Google Translation API not working even for one-page-long documents

    - by Saubhagya
    I'm using the Google Translation API to translate text from Simplified Chinese to English in my C# program. The problem is that if the text is small (around one line) the API is able to translate it, but if the text is larger (more than 3 lines) it gives an exception saying "The remote server returned an unexpected response: (414) Request-URI Too Large.". However, if I use translate.google.com in my browser it works fine. Please tell me how I can process large documents using the Google Translate API in my desktop application written in C#.

    Read the article
