Search Results

Search found 4072 results on 163 pages for 'social analytics'.

Page 116/163 | < Previous Page | 112 113 114 115 116 117 118 119 120 121 122 123  | Next Page >

  • get particular string using regex java

    - by hussain
    I want to know how to extract a particular string from a larger string. String j = "<a href=\"/watch?v=4Qx-lBqOqiQ&feature=popular\" onclick=\"\" onmousedown=\"yt.analytics.urchinTracker(\'/Events/Home/PersonalizedHome/POP/Logged_Out');\" ><span class=\"video-thumb video-thumb-220 \" id=\"video-thumb-4Qx-lBqOqiQ-8821469\"><img src=\"http://i1.ytimg.com/vi/4Qx-lBqOqiQ/hqdefault.jpg\" class=\"vimg220\" alt=\"Dog Squirrel Chasing A Squirrel\" title=\"Dog Squirrel Chasing A Squirrel\" onclick=\";yt.www.watch.watch5.IEshenanigans(event, this)\"><span class=\"video-time\"><span>1:08</span></span><span class=\"video-actions\"><button class=\"yt-uix-button-short yt-uix-button yt-uix-button-arrowbutton\" onclick=\"; return false;\" type=\"button\"> <img class=\"yt-uix-button-arrow\" src=\"http://s.ytimg.com/yt/img/pixel-vfl73.gif\" alt=\"\"> hai</a>"; From this I want to extract the strings href=\"/watch?v=4Qx-lBqOqiQ&feature=popular\" and src=\"http://i1.ytimg.com/vi/4Qx-lBqOqiQ/hqdefault.jpg\". Thanks in advance.
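    A minimal sketch of one way to do this with java.util.regex (the shortened input string below is illustrative, not the original HTML):

        import java.util.regex.Matcher;
        import java.util.regex.Pattern;

        public class ExtractAttrs {
            public static void main(String[] args) {
                String j = "<a href=\"/watch?v=4Qx-lBqOqiQ&feature=popular\" onclick=\"\">"
                         + "<img src=\"http://i1.ytimg.com/vi/4Qx-lBqOqiQ/hqdefault.jpg\" class=\"vimg220\">";

                // Non-greedy groups capture whatever sits between the quotes.
                Matcher href = Pattern.compile("href=\"(.*?)\"").matcher(j);
                Matcher src  = Pattern.compile("src=\"(.*?)\"").matcher(j);

                if (href.find()) System.out.println("href=\"" + href.group(1) + "\"");
                if (src.find())  System.out.println("src=\"" + src.group(1) + "\"");
            }
        }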

    Read the article

  • Is Using Python to MapReduce for Cassandra Dumb?

    - by UltimateBrent
    Since Cassandra doesn't have MapReduce built in yet (I think it's coming in 0.7), is it dumb to try and MapReduce with my Python client, or should I just use CouchDB or Mongo or something? The application is stats collection, so I need to be able to sum values with grouping to increment counters. I'm not actually building Google Analytics, but pretend I am: I want to keep track of which browsers appear, which pages they went to, and visits vs. pageviews. I would just atomically update my counters on write, but Cassandra isn't very good at counters either. Maybe Cassandra just isn't the right choice for this? Thanks!
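    For what it's worth, the kind of client-side "map/reduce" being described can be sketched in plain Python once the rows have been fetched from Cassandra (the row structure below is an assumption, not from the original post):

        from collections import Counter, defaultdict

        # Hypothetical event rows already read from Cassandra by the Python client.
        rows = [
            {"browser": "Firefox", "page": "/home",  "kind": "pageview"},
            {"browser": "Chrome",  "page": "/home",  "kind": "visit"},
            {"browser": "Firefox", "page": "/about", "kind": "pageview"},
        ]

        # "Map" each row to a key, then "reduce" by summing counts per key.
        browser_counts = Counter(row["browser"] for row in rows)

        per_page = defaultdict(Counter)
        for row in rows:
            per_page[row["page"]][row["kind"]] += 1

        print(browser_counts)   # Counter({'Firefox': 2, 'Chrome': 1})
        print(dict(per_page))   # {'/home': Counter({...}), '/about': Counter({...})}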

    Read the article

  • Private JQuery instance

    - by Nir Levy
    We are writing a SaaS-like solution that requires our customers to SCRIPT SRC some JavaScript code we are building (think Google Analytics scenario). We would like to use jQuery. However, since our customers might already have conflicting jQuery versions or other conflicting frameworks (prototype.js for one), we cannot tell them to source jquery.js. We were thinking of copying the jQuery source to create a 'private' jQuery instance and simply search/replace the jQuery and $ functions with myJQuery and $J. Is there any reason for this not to work? Has anyone tried something like this? What can we do about plugins?
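    For comparison, jQuery ships with a mechanism aimed at this situation: jQuery.noConflict(true) hands the global $ and jQuery back to the host page and returns a private reference to the copy you just loaded. A rough sketch (the widget code is illustrative):

        // Runs right after your own copy of jQuery has been loaded by the embed script.
        var myJQuery = jQuery.noConflict(true); // restore the page's $ and jQuery

        (function ($J) {
            // Inside this closure, $J is always your bundled jQuery,
            // no matter what the host page defines as $ (e.g. Prototype).
            $J(function () {
                $J('body').append('<div class="my-widget">widget goes here</div>');
            });
        })(myJQuery);

    Plugins would need to be loaded after your copy of jQuery but before the noConflict(true) call, so they attach to that private copy.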

    Read the article

  • How to calculate real-time stats?

    - by Diego Jancic
    I have a site with millions of users (well, actually it doesn't have any yet, but let's imagine), and I want to calculate some stats like "log-ins in the past hour". The problem is similar to the one described here: http://highscalability.com/blog/2008/4/19/how-to-build-a-real-time-analytics-system.html The simplest approach would be to do a select like this: select count(distinct user_id) from logs where date >= '20120601 1200' and date <= '20120601 1300' (of course other conditions could apply to the stats, like log-ins per country). This would be really slow, especially if the table has millions (or even thousands) of rows, and I want to run this query every time a page is displayed. How would you summarize the data? What should go into the (mem)cache? EDIT: I'm looking for a way to de-normalize the data, or to keep the cache up to date. For example, I could increment an in-memory variable every time someone logs in, but that would only tell me the total number of log-ins, not the "log-ins in the last hour". Hope it's clearer now.
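    One common shape of the denormalization being asked about is a small summary table that is bumped on every log-in and queried instead of the raw logs. A sketch, assuming MySQL-style syntax and invented names (note it counts log-in events, not distinct users):

        CREATE TABLE login_summary (
            bucket  DATETIME NOT NULL,   -- log-in time truncated to the minute
            country CHAR(2)  NOT NULL,
            logins  INT      NOT NULL DEFAULT 0,
            PRIMARY KEY (bucket, country)
        );

        -- Inside the existing write path, on every log-in:
        INSERT INTO login_summary (bucket, country, logins)
        VALUES ('2012-06-01 12:34:00', 'US', 1)
        ON DUPLICATE KEY UPDATE logins = logins + 1;

        -- "Log-ins in the past hour" then reads at most 60 rows per country:
        SELECT SUM(logins)
        FROM login_summary
        WHERE bucket >= NOW() - INTERVAL 1 HOUR;

    Distinct users per hour needs a slightly different structure, e.g. a (bucket, user_id) table filled with INSERT IGNORE and counted per window.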

    Read the article

  • How to get the request url from HttpServletRequest

    - by Gagan
    Say I make a GET request like this:

        GET http://cotnet.diggstatic.com:6000/js/loader/443/JS_Libraries,jquery|Class|analytics|lightbox|label|jquery-dom|jquery-cookie?q=hello#frag HTTP/1.0
        Host: cotnet.diggstatic.com:6000

    My servlet takes the request like this: HttpServletRequest req; When I debug my server and execute, I get the following:

        req.getRequestURL().toString() = "http://cotnet.diggstatic.com:6000/js/loader/443/JS_Libraries,jquery%7cClass%7canalytics%7clightbox%7clabel%7cjquery-dom%7cjquery-cookie"
        req.getRequestURI() = "/js/loader/443/JS_Libraries,jquery%7cClass%7canalytics%7clightbox%7clabel%7cjquery-dom%7cjquery-cookie"
        req.getQueryString() = "q=hello"

    How does one get the fragment information? Also, when I debug the request, I see a uri_ field of type java.net.URI which has the fragment information. This is exactly what I want. How can I get that?
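    As a side note (a sketch, not from the original thread): the #fragment is normally stripped by the client before the request is sent, so it never reaches the servlet at all; if you do have the full URL as a string, java.net.URI can split it apart:

        import java.net.URI;

        public class FragmentDemo {
            public static void main(String[] args) {
                URI uri = URI.create("http://cotnet.diggstatic.com:6000/js/loader/443/x?q=hello#frag");
                System.out.println(uri.getPath());      // /js/loader/443/x
                System.out.println(uri.getQuery());     // q=hello
                System.out.println(uri.getFragment());  // frag
            }
        }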

    Read the article

  • How to set up Webtrends to segment paid search campaigns?

    - by hfidgen
    Hiya, I'm trying to learn how to set up Webtrends Analytics to segment paid search properly. I've dug up the fact that you need to pass "WT.srch=1" in the URL from the advert, but are there other parameters which you can use to segment: campaign, ad group, ad network? I know you can do this with Google Analytics at the push of a button using the various UTM tags, but Webtrends seems to use the proprietary WT. tags. Yet I can only find one tag... wtf :P Anyone able to help? Thanks! H

    Read the article

  • How to add iPhone lib *.a files to Xcode's SVN (SCM)?

    - by slatvick
    Have: an Xcode project with the Google Analytics lib; it compiles normally. I just want to put it into the already-working SVN so I can build the project from my work Mac OS X machine without any additional steps. I've tried different ways to add the *.a file to the SVN, but none of them have worked. When adding a directory, all files except the *.a end up in the SVN. I bet there is no such problem with third-party SVN clients, but I want to give the Xcode one more chance, so I'm asking here. Guys, is it possible to add *.a files to SVN using Xcode?

    Read the article

  • generic async loading method for page web scripts?

    - by boomhauer
    The Google Analytics code went to an async load model some time back. I've noticed that a lot of the other scripts I use on many sites are causing slow load times - specifically the AddThis script and the Facebook Like button. The slow load times of these scripts are causing the Google bot to calculate my page load times as being much slower than previously. I'd like to know if there is a standard/generic way of causing these scripts to load async as well, or perhaps a pointer to someone who has done the work for this already. Seems this would be a popular thing to do, but not much luck searching around.
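    The pattern Google's async snippet uses can be applied to other third-party scripts as well; a generic sketch (the script URL is a placeholder):

        (function () {
            // Create the <script> element ourselves so the browser fetches it
            // without blocking parsing of the rest of the page.
            var s = document.createElement('script');
            s.src = 'http://example.com/widgets/third-party-button.js'; // placeholder
            s.async = true; // honoured by browsers that support the async attribute
            var first = document.getElementsByTagName('script')[0];
            first.parentNode.insertBefore(s, first);
        })();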

    Read the article

  • Using jQuery to call a web service

    - by Matt
    I have created a web service which takes a username and password as parameters and returns a list of children in JSON (the user is a Social Worker). The web service is hosted locally with IIS7. I am attempting to access the web service using JavaScript/jQuery because it will eventually need to run as a mobile app. I'm not really experienced with web services, or JavaScript for that matter, but the following two links seemed to point me in the right direction: http://williamsportwebdeveloper.com/cgi/wp/?p=494 http://encosia.com/using-jquery-to-consume-aspnet-json-web-services/

    This is my HTML page:

        <%@ Page Title="" Language="C#" MasterPageFile="~/MasterPage.Master" AutoEventWireup="true" CodeBehind="TestWebService.aspx.cs" Inherits="Sponsor_A_Child.TestWebService" %>
        <asp:Content ID="Content1" ContentPlaceHolderID="stylesPlaceHolder" runat="server">
            <script type="text/javascript" src="Scripts/jquery-1.7.1.js">
                $(document).ready(function () { });

                function LoginClientClick() {
                    $("#query_results").empty();
                    $("#query_results").append('<table id="ResultsTable" class="ChildrenTable"><tr><th>Child_ID</th><th>Child_Name</th><th>Child_Surname</th></tr>');
                    $.ajax({
                        type: "POST",
                        contentType: "application/json; charset=utf-8",
                        url: "http://localhost/PhoneWebServices/GetChildren.asmx/GetMyChildren",
                        data: '{ "email" : "' + $("#EmailBox").val() + '", "password": "' + $("#PasswordBox").val() + '" }',
                        dataType: "json",
                        success: function (msg) {
                            var c = eval(msg.d);
                            alert("" + c);
                            for (var i in c) {
                                $("#ResultsTable tr:last").after("<tr><td>" + c[i][0] + "</td><td>" + c[i][1] + "</td><td>" + c[i][2] + "</td></tr>");
                            }
                        }
                    });
                }
            </script>
        </asp:Content>
        <asp:Content ID="Content2" ContentPlaceHolderID="contentPlaceHolder" runat="server">
            <div id="LoginDiv">
                Email: <input id="EmailBox" type="text" /><br />
                Password: <input id="PasswordBox" type="password" /><br />
                <input id="LoginButton" type="button" value="Submit" onclick="LoginClientClick()" />
            </div>
            <div id="query_results"></div>
        </asp:Content>

    And this is my web service code:

        [WebMethod(Description = "Returns the list of children for whom the social worker is responsible.")]
        public String GetMyChildren(String email, String password)
        {
            DataSet MyChildren = new DataSet();
            int ID = SocialWorkerLogin(email, password);
            if (ID > 0)
            {
                MyChildren = FillChildrenTable(ID);
            }
            MyChildren.DataSetName = "My Children"; // To prevent 'DataTable name not set' error
            string[][] JaggedArray = new string[MyChildren.Tables[0].Rows.Count][];
            int i = 0;
            foreach (DataRow rs in MyChildren.Tables[0].Rows)
            {
                JaggedArray[i] = new string[] { rs["Child_ID"].ToString(), rs["Child_Name"].ToString(), rs["Child_Surname"].ToString() };
                i = i + 1;
            }
            // Return JSON data
            JavaScriptSerializer js = new JavaScriptSerializer();
            string strJSON = js.Serialize(JaggedArray);
            return strJSON;
        }

    I followed the examples in the provided links, but when I press Submit, only the table headers appear and not the list of children. When I test the web service on its own, though, it does return a JSON string, so that part seems to be working. Any help is greatly appreciated :)

    Read the article

  • SQL Server virtual memory usage and performance

    - by user365035
    Hello, I have a very large DB used mostly for analytics. The performance overall is very sluggish. I just noticed that when running the query below, the amount of virtual memory used greatly exceeds the amount of physical memory available. Currently, physical memory is 10GB (10238 MB), whereas the virtual memory figure returned is significantly larger: 8388607 MB. That seems really wrong, but I'm at a bit of a loss on how to proceed.

        USE [master];
        GO
        select cpu_count
             , hyperthread_ratio
             , physical_memory_in_bytes / 1048576 as 'mem_MB'
             , virtual_memory_in_bytes / 1048576 as 'virtual_mem_MB'
             , max_workers_count
             , os_error_mode
             , os_priority_class
        from sys.dm_os_sys_info

    Read the article

  • WinForms / .Net interactive world map - how?

    - by FerretallicA
    In a CD collection program, I have each artist's country of origin stored in the main database and want to display a map of the world which: (a) colour-codes each country depending on the number of CDs by artists in that country, and (b) allows clicking on each country to filter a list of CDs to only ones by artists in that country. This is a heavily simplified version of what I'm trying to do, but if I can at least get this far the rest should be easy enough to figure out. So far the closest thing I've found to what I'm trying to do is here: http://www.synergetechsolutions.com/blog/analytics-world-map-control Ideally I don't want to be embedding Flash in my program though, and the only other solutions I've found all involve SVG, which I haven't managed to get working in practice outside of a web browser control (and I DEFINITELY don't want to be embedding a browser in the forms). Something in pure managed code and either GDI+ or WPF would be preferable. Are there any existing components that would get me started, or can anyone suggest how to approach it from scratch?

    Read the article

  • Unicorn: Which number of worker processes to use?

    - by blackbird07
    I am running a Ruby on Rails app on a virtual Linux server that is capped at 1GB RAM. Currently, I am constantly hitting the limit and would like to optimize memory utilization. One option I am looking at is reducing the number of Unicorn workers. So what is the best way to determine the number of Unicorn workers to use? The current setting is 10 workers, but the maximum number of requests per second I have seen on Google Analytics Real-Time is 3 (and that only once, at a peak time; 99% of the time it doesn't go above 1 request per second). So is it a safe assumption that I can - for now - go with 4 workers, leaving room for unexpected amounts of requests? What are the metrics I should look at for determining the number of workers, and what are the tools I can use for that on my Ubuntu machine?
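    For reference, the worker count lives in the Unicorn config file; a minimal sketch (the numbers are illustrative, not a recommendation):

        # config/unicorn.rb
        worker_processes 4   # total RAM use is roughly the master plus workers * per-worker RSS
        timeout 30
        preload_app true     # can reduce per-worker memory via copy-on-write, depending on the Ruby/GC

    One rough way to size it is the RAM left over for the app divided by the per-worker resident size observed under load, sanity-checked against the requests per second actually seen.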

    Read the article

  • PHP script for creating calendar table or jquery complete solution required

    - by finn_meister
    OK, so this is what I want to make: http://i44.tinypic.com/eiwphl.jpg (red = booked, green = available). I have data in MySQL in the format of: property_id, booked_from, booked_until. Before I start trying to create the correct loops etc. to create and style the table, I thought I'd best ask if there are already good jQuery plugins / PHP classes that create this visual interface, uncluttered enough to allow me to add a select-date-range method (like Google Analytics)? I'm looking for something to create a basic calendar table on a loop, which I can then style and add jQuery features to. Though it's worth asking if there's a complete package that already does what I plan on making?! (jQuery UI's date-picker doesn't look powerful enough / easy enough to modify)
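    If it helps, the "basic calendar table on a loop" part is only a few lines of PHP; a bare-bones sketch (the booked ranges and class names are made up for illustration):

        <?php
        $year  = 2012;
        $month = 6;
        $days  = (int) date('t', mktime(0, 0, 0, $month, 1, $year)); // days in the month
        // One entry per booking, as pulled from property_id / booked_from / booked_until.
        $booked = array(array('2012-06-05', '2012-06-09'));

        function is_booked($date, $ranges) {
            foreach ($ranges as $r) {
                if ($date >= $r[0] && $date <= $r[1]) return true; // Y-m-d strings compare correctly
            }
            return false;
        }

        echo '<table class="calendar"><tr>';
        for ($d = 1; $d <= $days; $d++) {
            $date  = sprintf('%04d-%02d-%02d', $year, $month, $d);
            $class = is_booked($date, $booked) ? 'booked' : 'available';
            echo "<td class=\"$class\">$d</td>";
            if ($d % 7 == 0) echo '</tr><tr>'; // wrap to a new row every 7 days
        }
        echo '</tr></table>';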

    Read the article

  • white-label collaborative open-source development (e.g. github/sourceforge/google-code in a box) ?

    - by Justin Grant
    Does anyone have a recommendation for an open-source or paid (either packaged or SaaS) solution for integrating collaborative development features into your own website? Here are more details: We currently host an online plugin gallery for our product. Users can upload and download plugins. But users can't easily collaborate on a plugin's development, can't easily report and track bugs on a plugin, can't easily track a plugin's versions or roadmap, etc. Of course, contributors can host their plugin development on GitHub, SourceForge, Google Code, CodePlex, etc. But keeping users on our website has some advantages. For example: we can use single sign-on to avoid requiring yet another username/password; we can integrate end-user issue tracking into our existing online issue-tracking systems; we can get integrated analytics so we can better meet the needs of top contributors as well as downloaders; and we can easily award reputation points to committers just like we do for people who answer lots of questions. Anyone know a good solution for white-label sites for open-source project developer collaboration?

    Read the article

  • Storing millions of URLs in a database for fast pattern matching

    - by Paras Chopra
    I am developing a web analytics kind of system which needs to log the referring URL, landing page URL and search keywords for every visitor on the website. What I want to do with this collected data is to allow the end user to query it, such as "Show me all visitors who came from Bing.com searching for a phrase that contains 'red shoes'" or "Show me all visitors who landed on a URL that contained 'campaign=twitter_ad'", etc. Because this system will be used on many big websites, the amount of data that needs to be logged will grow really, really fast. So, my question: a) what would be the best strategy for logging so that scaling the system doesn't become a pain; b) how do I use that architecture for rapid querying of arbitrary requests? Is there a special method of storing URLs so that querying them gets faster? In addition to the MySQL database that I use, I am exploring (and open to) other alternatives better suited for this task.
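    One frequently used trick for the "special method of storing URLs" part is to parse each URL at write time and keep the interesting pieces in their own indexed columns, so most filters avoid scanning the full string; a sketch with invented names:

        CREATE TABLE visits (
            id            BIGINT UNSIGNED AUTO_INCREMENT PRIMARY KEY,
            referrer_host VARCHAR(255),   -- e.g. 'bing.com', parsed out at logging time
            search_terms  VARCHAR(255),   -- extracted from the referrer's query string
            landing_path  VARCHAR(255),
            landing_query VARCHAR(255),   -- e.g. 'campaign=twitter_ad'
            visited_at    DATETIME,
            KEY idx_referrer (referrer_host, search_terms),
            KEY idx_landing  (landing_query)
        );

        -- "Visitors from Bing searching for 'red shoes'": the host filter uses the
        -- index; the contains-check only scans the already-narrowed rows.
        SELECT * FROM visits
        WHERE referrer_host = 'bing.com'
          AND search_terms LIKE '%red shoes%';

    Substring searches like the LIKE above remain the expensive part; at larger scale they typically move to a full-text index or a dedicated search engine rather than plain B-tree indexes.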

    Read the article

  • Can't switch tab, replace the value of an input, and call a function at the same time

    - by marharépa
    Hi! I feel bad for asking all my little questions here, but I can't find the answer via Google. :( I'd like to switch tabs, replace an input value and call a function with one click.

    THE JS:

        function ApplyTableId(id) {
            var $tabs = $('#tabs').tabs();
            $('a.stat').click(function() {
                $tabs.tabs('select', 2); // switch to third tab
            });
            $('tableId').val('ga:' + id); // replace the input with id=tableId's val
            getAccountFeed(); // call another function
        }

    The other JS function, which is called by the first script:

        function getAccountFeed() {
            var myFeedUri = 'https://www.google.com/analytics/feeds/accounts/default?max-results=50';
            myService.getAccountFeed(myFeedUri, handleAccountFeed, handleError);
        }

    This is what I want to call, and here is the HTML:

        TAB1: <a class="stat" onClick="return ApplyTableId(this.getAttribute('id'));" id="7777777" />asd</a>
        TAB3: <input type="text" value="asd" id="tableId"/>

    Please tell me what I did wrong :(

    Read the article

  • iOS - Application logging test and production code

    - by Peter Warbo
    I am doing a bunch of logging when I'm testing my application, which is useful for getting information about variable state and such. However, I have read that you should use logging sparingly in production code (because it can potentially slow down your application). But now my question is: if my app is in production and people are using it, whenever a crash (God forbid) occurs, how will I be able to interpret the crash information if I have removed the logging statements? Then I suppose I will only have a stack trace to interpret? Does this mean I should leave logging in production code only WHERE it's really essential for me to interpret what has happened? Also, how will the logging statements relate to the crash reports? Will they be combined? I'm thinking of using Flurry for analytics and crash reports...

    Read the article

  • How to record when user follows external links without slowing user down

    - by taw
    I want to track when users click external links, for analytics purposes. The simplest solution is to replace all external links with links to a special record-and-redirect controller, but that would slow the user down unnecessarily. The second idea would be to override the click event and, within it, $.post a message to the record controller, then let the main event handler happen, which will usually be either a click (open link in same tab) or a middle click (open in new tab) - good either way, and the user won't have to wait for my server to record it; it's fire-and-forget. (I don't care if users without Javascript don't get tracked) Is that a reasonable way to go? Or what else would be the best way to track all external link clicks?
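    A sketch of the fire-and-forget idea described above (the /track endpoint and the selector are assumptions):

        // Post the clicked URL to a tracking endpoint without waiting for the
        // response, then let the browser's default navigation go ahead.
        $('a.external').click(function () {
            $.post('/track', { url: this.href });
            // No preventDefault / return false, so the click behaves as usual.
        });

    Depending on the browser, middle clicks may not fire the click event at all, so a mousedown handler is sometimes used for those.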

    Read the article

  • How would you measure conversion for an iPhone App Download?

    - by Eran Kampf
    I want to test how users convert from my web page to downloading my iPhone app. Right now the best funnel I've figured out is to go through a page that pings Google Analytics and then redirects to an iTunes link. But this funnel only measures conversion for users who got redirected to iTunes, not whether they actually downloaded the app once they saw it in iTunes. Does anyone know of a way to measure conversion to the actual download? (I know this is not a coding question, but it's still a problem app developers would encounter trying to market their app.)

    Read the article

  • Read XElement fully using LINQ

    - by Ramya
    Hi, I have an XElement which I get after parsing some XML. This XElement needs to be read only when the need arises, so I have stored it in a list for future use. I have to read this XElement using LINQ.

        XDocument doc = XDocument.Parse(DataManager.offeringElements[index].DataElem.ToString());
        var docNode = from dataNode in doc.Descendants("DataLinks")
                      select new
                      {
                          Offering = dataNode.Element("link").Value,
                          linkUrl = dataNode.Element("link").Attribute("href").Value
                      };

    The XElement has the following nodes: a. Management b. Analytics c. Development. My problem is that I am not able to read all three nodes; I am able to get only the first node. Where am I going wrong?
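    A self-contained sketch of the distinction involved (the XML shape and names here are assumptions based on the description): Element("link") only ever returns the first matching child, while Elements("link") enumerates all of them.

        using System;
        using System.Linq;
        using System.Xml.Linq;

        class Demo
        {
            static void Main()
            {
                XDocument doc = XDocument.Parse(
                    "<Root><DataLinks>" +
                    "<link href=\"/mgmt\">Management</link>" +
                    "<link href=\"/analytics\">Analytics</link>" +
                    "<link href=\"/dev\">Development</link>" +
                    "</DataLinks></Root>");

                // Elements("link") yields every <link> child, not just the first one.
                var links = from link in doc.Descendants("DataLinks").Elements("link")
                            select new
                            {
                                Offering = link.Value,
                                LinkUrl  = (string)link.Attribute("href")
                            };

                foreach (var l in links)
                    Console.WriteLine("{0} -> {1}", l.Offering, l.LinkUrl);
            }
        }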

    Read the article

  • Question about [literally] mapping location based on an IP address.

    - by Andrew
    I've been bored lately and I want to start a new project. I was looking at a website mentioned in a different question (http://www.grapevinegame.com/), and I thought the map and how it plots a point based on someone's IP (I assume) is pretty nifty. I want to do something like that, but I have no idea how it's done. I know you can get latitude and longitude, city and state, and some more with some already-written scripts, but how would you plot those on a map of the world? I've seen it other places, like Google Analytics and such, as well. It seems like a neat thing to be able to do, so I was just wondering how exactly it works. :-p.
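    The plotting step usually comes down to projecting the latitude/longitude from a GeoIP lookup onto the map image; a sketch assuming a plain equirectangular world-map graphic:

        // Convert a lat/lon pair to pixel coordinates on an equirectangular map image.
        function plotOnMap(lat, lon, mapWidth, mapHeight) {
            var x = (lon + 180) / 360 * mapWidth;   // longitude is linear left-to-right
            var y = (90 - lat) / 180 * mapHeight;   // latitude is linear top-to-bottom
            return { x: x, y: y };                  // place an absolutely-positioned dot here
        }

        // Example: roughly Paris on a 1000x500 map -> about (507, 114)
        console.log(plotOnMap(48.85, 2.35, 1000, 500));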

    Read the article

  • Is there a SaaS for logging user activity?

    - by JoshL
    In almost every app that I build I create some kind of user log table to log various activities that my actual USERS (not visitors, but someone with an account) perform on the site. This is primarily used for customer service issues to allow me to pull up a record of the pages and actions that a user has visited. The downside to this is the size of the UserLogs table. It gets immense. I'm not sure if it is common practice or not for others to log INDIVIDUAL (not aggregate like Google Analytics) user behavior to a database, but if it is I'm wondering if any form of a SaaS exists to help offload this task? I essentially need a RESTful API that lets me store and retrieve individual user activity quickly and securely. Anyone know of any or am I the only one who has this issue?

    Read the article

  • RemoteWebDriver InternetExplorer navigate().to() timeout?

    - by the qwerty
    I was running a test remotely on Internet Explorer, and when using navigate().to() Selenium returns me this: "12:13:58.770 INFO - WebDriver remote server: Exception: The driver reported that the command timed out. There may be several reasons for this. Check that the destination site is in IE's 'Trusted Sites' (accessed from Tools > Internet Options in the 'Security' tab). If it is a trusted site, then the request may have taken more than a minute to finish." I've done what's said, but when looking at the browser the page is loaded, yet this message continues. I've already tried what Simon told me: "(16:32:54) simonstewart: ponto: http://code.google.com/p/selenium/wiki/FrequentlyAskedQuestions#Q:_The_does_not_work_well_on_Vista._How_do_I_get_it_to_work_as_e" but it did not solve the problem. Could it be Google Analytics fetching data in the background or something like that? PS: I ran the test on Firefox and it works well. I've tried on Windows 7 and Windows XP, and on Internet Explorer 7 and Internet Explorer 8.

    Read the article

  • Produce a script to hit Google once a day and log our position in the results?

    - by hawbsl
    The need has arisen within our organisation to monitor (on a daily basis) where our site appears (both organic and PPC) on page 1 of Google, and also where a key competitor appears, for certain keywords. In the immediate short term a colleague is doing this by hitting Google manually and jotting down the results. Yep. It occurs to us we can write a script (e.g. using C#) to do this. I know Analytics will tell us an awful lot, but it doesn't note the competitor's position, plus I don't think it has other data we want. The question is, is there an existing basic tool which does this (for free, I guess)? And if we write it ourselves, where do we start, and are there obvious pitfalls to avoid (for example, can Google detect and block automated requests)?

    Read the article

  • Store data in Ruby on Rails without Database

    - by snowmaninthesun
    I have a few data values that I need to store in my Rails app and wanted to know if there are any alternatives to creating a database table just to do this simple task. Background: I'm writing some analytics and dashboard tools for my Ruby on Rails app and I'm hoping to speed up the dashboard by caching results that will never change. Right now I pull all users for the last 30 days and rearrange them so I can see the number of new users per day. It works great but takes quite a long time; in reality I should only need to calculate the most recent day and just store the rest of the array somewhere else. Where is the best place to store this array? Creating a database table seems a bit overkill, and I'm not sure that global variables are the correct answer. Is there a best practice for persisting data like this? If anyone has done anything like this before, let me know what you did and how it turned out.
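    One lightweight option, sketched below, is the built-in Rails cache store: compute the historical per-day counts once, cache them, and only recalculate today's figure on each dashboard load (the method name, cache key and query are made up for illustration):

        # Somewhere in a dashboard helper or model (sketch).
        def new_users_per_day
          history = Rails.cache.fetch("dashboard/new_users_per_day", :expires_in => 12.hours) do
            # Expensive part: the last 30 days excluding today, which never changes.
            User.where("created_at >= ? AND created_at < ?", 30.days.ago.to_date, Date.today)
                .group("DATE(created_at)")
                .count
          end
          # Cheap part: only today's count is recalculated on every request.
          history.merge(Date.today.to_s => User.where("created_at >= ?", Date.today).count)
        end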

    Read the article
