Search Results

Search found 1603 results on 65 pages for 'analytics'.


  • How to do Google Analytics tracking for Ajax calls in Facebook?

    - by seatoskyhk
    I have an FBML game and need to track all Ajax calls. At the bottom of the page I have the tracking markup, and in the JavaScript functions I put this: Facebook.urchinTracker("/ajaxcallname/"); However, it doesn't work. What I found out is that utmac (the Google account ID) is empty: _uacct is empty! Even when I set it in the FBML, it is still empty, and I can't find a way to dynamically set _uacct to my Google account ID. Does anyone have any idea?
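    For reference (not from the post): Facebook's FBML documentation of that era described an fb:google-analytics tag that rendered the tracker and set _uacct for later Facebook.urchinTracker calls. A minimal sketch, assuming the tag is available to the app and using a placeholder account ID:

      <!-- Renders the Google Analytics include inside FBML and sets _uacct. -->
      <fb:google-analytics uacct="UA-1234567-1" />
      <!-- After this renders, calls such as Facebook.urchinTracker("/ajaxcallname/")
           should be attributed to UA-1234567-1. -->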

    Read the article

  • Firefox hangs waiting for ssl.google-analytics.com

    - by squillman
    It seems that FF has a problem with 403 Access Denied responses from proxies, at least for ssl.google-analytics.com. I've found this post, which describes my problem. I'm posting my workaround as an answer, but would also welcome any more information if anyone has it, as I can't find anything! EDIT: Note that the current version of Firefox experiencing this issue is 3.0.10. EDIT: Still there in FF 3.5...

    Read the article

  • Two different domains as one user session

    - by Mathew Foscarini
    I have two websites that run as one service, each offering articles from a different market. At the top of each page the two domains are shown as menu options; if a user clicks one, they switch to the other domain. See here: http://www.cgtag.com Each domain has a different Google Analytics account, and when a user switches domains Google counts this as a new session, listing the other domain as the "referral" for that new session. When the user switches back to the first domain, Google counts this as a returning visitor. This is messing up my reports, showing returning-visitor numbers that are higher than reality, inflating hits on landing pages whenever the user switches, and listing the other domain as a referral site. I've found tips on how to treat two domains as one website, but that merges the data. I want to keep the two domains separate so that I can track each one's performance, but I don't want to count domain changes as new sessions. Maybe something like treating the two domains as subdomains.
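    One possible approach (an illustration, not from the post; the property IDs are placeholders): classic ga.js supports cross-domain linking, so the session can carry over when the visitor switches domains instead of registering a new referral visit, while each domain keeps its own property:

      var _gaq = _gaq || [];
      // This domain's own property, so per-domain reporting stays intact.
      _gaq.push(['_setAccount', 'UA-11111111-1']);
      // Accept linker data (session/campaign cookies) arriving from the sister domain.
      _gaq.push(['_setAllowLinker', true]);
      _gaq.push(['_trackPageview']);

      // Cross-domain menu links go through _link so the visit continues as one
      // session rather than counting as a new session referred by the other domain:
      // <a href="http://www.cgtag.com/" onclick="_gaq.push(['_link', this.href]); return false;">cgtag</a>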

    Read the article

  • Reversing a Google Analytics transaction

    - by Dan
    I am confused about what code must be executed to reverse a Google Analytics transaction. I have the following code pasted within a test page:

      <body onLoad="function()">
      <script type="text/javascript">
        var _gaq = _gaq || [];
        _gaq.push(['_setAccount', 'UA-25305776-3']);
        _gaq.push(['_trackPageview']);
        _gaq.push(['_addTrans',
          '11455',   // order ID - required
          '-42.38',  // total - required
          '-2.38',   // tax
          '-15.00'   // shipping
        ]);
        _gaq.push(['_addItem',
          '11455',   // order ID - necessary to associate item with transaction
          'Evan Turner Turningpoint™ Basketball Pants', // product name
          '25.00',   // unit price - required
          '-1'       // quantity - required
        ]);
        _gaq.push(['_trackTrans']);
        (function() {
          var ga = document.createElement('script');
          ga.type = 'text/javascript';
          ga.async = true;
          ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
          var s = document.getElementsByTagName('script')[0];
          s.parentNode.insertBefore(ga, s);
        })();
      </script>

    Is this correct? Thanks!

    Read the article

  • Track sales and commission with third-party tool

    - by Andrew
    I have a clothing website where I link to various clothing retailers. I have reached an agreement with one of the retailers whereby they will pay a commission to us for every sale they make from traffic that was referred by our site. I need a mechanism for tracking how much commission should be paid to us, one that requires as little implementation work as possible on their side. We both have Google Analytics. Option 1: they record a goal in their GA account whenever someone makes a purchase on their site, see how many completed goals are marked as referral traffic from our site, and calculate commission accordingly. The problem with this is that the whole process of calculating and paying commission will be manual: they will need to frequently check how many sales were generated by referral traffic from our site, and we will probably have to chase them for commission payments. Also, since we won't have access to their GA data, we will need to trust that they report all sales accurately. Option 2: sign them up to an affiliate network like Commission Junction or Google's Affiliate Network, and connect to them through this network. The problem with this solution is that it seems too heavyweight; ideally we don't want to ask a retailer to go through a whole sign-up process just to deal with us and pay us commission. I am assuming that there must be some lightweight service that tracks the number of sales by one site and pays commission accordingly to the other site, where the sign-up and installation procedure is simple and fast.

    Read the article

  • Building a DVR system for use with custom windows application (video analytics)

    - by Michael
    Is there a good PCIe DVR capture card that has at least 4 channels as well as hardware encoding? It would need decent driver support in Windows XP or Windows 7. I have looked at various video capture cards, as well as an integrated video capture card/motherboard from Huperlabs, but so far I have not found one with a decent review and driver support that I can verify. A really small card would be nice because I am aiming for a fairly small form factor. Huperlabs' stuff is pretty awesome, but they are slow to get back to me and they bundle their analytics software with the hardware (extra cost for nothing). The DVR is being used for security.

    Read the article

  • Smarter, Faster, Cheaper: The Insurance Industry’s Dream

    - by Jenna Danko
    On June 3rd, I saw the Gaylord Resort Centre in Washington D.C. become the hub for C-level executives and managers of insurance carriers at the IASA 2013 Conference. Insurance accounting/regulation and technology sessions took the focus, but there were plenty of tertiary sessions for career development, which complemented the conference's strong networking side. As an exhibitor, Oracle, along with several hundred other product providers, welcomed the opportunity to display and demonstrate our solutions, and we were encouraged by the hustle and bustle of the exhibition floor. The IASA organizers had pre-arranged fast-track tours whereby interested conference delegates could sign up for a series of like-themed presentations from vendors, giving them 'speed dating' introductions to possible solutions and services. Oracle participated in a number of these, which were very well subscribed. Clearly, the conference had a strong business focus; however, attendees saw technology as a key enabler to get their processes done smarter, faster and cheaper. As we navigated the exhibition, the inquiries that came to us made it clear that insurance carriers are gravitating to a number of focus areas:

      • Navigating the maze of upcoming regulatory reporting changes. For US carriers with European holdings, Solvency II carries a myriad of rules and reporting requirements. Alignment across the globe of Own Risk and Solvency Assessment (ORSA) processes brings to the fore the National Association of Insurance Commissioners' (NAIC) recent guidance manual publication.

      • Doing more with less, and expecting more from technology for fewer dollars. The overall cost of IT, in particular hardware, has dropped in real terms (though the appetite for more has risen: more CPU, more RAM, more storage), but software has seen less change. Clearly, customers expect either to pay less or to get a lot more from their software solutions for the same buck.

      • Doing things smarter - a recognition that, at the current pace of technological advance, standing still effectively means going backwards. Technology, and in particular the interaction of technology with human business processes, has undergone incredible change over the past 5 years. Consumer usage (iPhones, etc.) has been at the forefront, but ever more effective technology exploitation is now beginning to take place at the enterprise level. Data - in particular, the knowledge gleaned from data - is refining and improving business processes. Organizations are now consuming more data than ever before, and volumes are set to grow exponentially for some time to come. Amassing large volumes of data is one thing; effectively analyzing that data is another. It is the results of such analysis that lead to improvements, both in insurance product offerings and in the processes that support them.

      • Regulatory compliance - damned if you do and damned if you don't! Around the globe a lot is changing from a regulatory perspective, and whilst greater convergence across jurisdictions is bringing uniformity, there is also a lot of work to be done in the next 5 years. Just as with big data, hidden behind effective regulatory compliance there often lie golden nuggets that can give competitive advantage.
    From Oracle's perspective, our Rating Engine, Billing, Document Management and Insurance Analytics solutions on display served to strike up good conversations and, as is always the case at conferences, it was a great opportunity to meet and speak with existing Oracle customers whom we might not otherwise have caught up with for a while. Fortunately, I was able to catch a few sessions after the exhibition closed. The speaker quality was high and the audiences asked challenging but pertinent questions. During Dr. Jackie Freiberg's keynote, "Bye Bye Business as Usual," the author discussed 8 strategies to help leaders create a culture where teams consistently deliver innovative ideas by disrupting the status quo. The very first strategy: get wired for innovation. Freiberg admitted that folks in the insurance and financial services industry understand and know innovation is important, but they are oftentimes slow adopters. Today, technology and innovation go hand in hand. Speaking with delegates during and after the conference, I heard a high degree of satisfaction in their positive comments about the speaker sessions and the exhibitors. I suspect many will be back in 2014, when Indianapolis is the conference location. Did you attend the IASA Conference in Washington D.C.? If so, I would love to hear your comments. Andrew Collins is the Director, Solvency II of Oracle Financial Services. He can be reached at andrew.collins AT oracle.com.

    Read the article

  • Quadratic Programming with Oracle R Enterprise

    - by Jeff Taylor-Oracle
    I wanted to use quadprog with ORE on a server running Oracle Solaris 11.2 on an Oracle SPARC T4-2 server. For background, see:

      Oracle SPARC T4-2: http://docs.oracle.com/cd/E23075_01/
      Oracle Solaris 11.2: http://www.oracle.com/technetwork/server-storage/solaris11/overview/index.html
      quadprog (functions to solve quadratic programming problems): http://cran.r-project.org/web/packages/quadprog/index.html
      Oracle R Enterprise ("ORE") 1.4: http://www.oracle.com/technetwork/database/options/advanced-analytics/r-enterprise/ore-downloads-1502823.html

    Problem: the build expects Solaris Studio at a path that doesn't match my installation:

      $ ORE CMD INSTALL quadprog_1.5-5.tar.gz
      * installing to library '/u01/app/oracle/product/12.1.0/dbhome_1/R/library'
      * installing *source* package 'quadprog' ...
      ** package 'quadprog' successfully unpacked and MD5 sums checked
      ** libs
      /opt/SunProd/studio12u3/solarisstudio12.3/bin/f95 -m64 -PIC -g -c aind.f -o aind.o
      bash: /opt/SunProd/studio12u3/solarisstudio12.3/bin/f95: No such file or directory
      *** Error code 1
      make: Fatal error: Command failed for target `aind.o'
      ERROR: compilation failed for package 'quadprog'
      * removing '/u01/app/oracle/product/12.1.0/dbhome_1/R/library/quadprog'

      $ ls -l /opt/solarisstudio12.3/bin/f95
      lrwxrwxrwx   1 root     root          15 Aug 19 17:36 /opt/solarisstudio12.3/bin/f95 -> ../prod/bin/f90

    Solution: a symbolic link:

      $ sudo mkdir -p /opt/SunProd/studio12u3
      $ sudo ln -s /opt/solarisstudio12.3 /opt/SunProd/studio12u3/

    Now it is all good:

      $ ORE CMD INSTALL quadprog_1.5-5.tar.gz
      * installing to library '/u01/app/oracle/product/12.1.0/dbhome_1/R/library'
      * installing *source* package 'quadprog' ...
      ** package 'quadprog' successfully unpacked and MD5 sums checked
      ** libs
      /opt/SunProd/studio12u3/solarisstudio12.3/bin/f95 -m64 -PIC -g -c aind.f -o aind.o
      /opt/SunProd/studio12u3/solarisstudio12.3/bin/cc -xc99 -m64 -I/usr/lib/64/R/include -DNDEBUG -KPIC -xlibmieee -c init.c -o init.o
      /opt/SunProd/studio12u3/solarisstudio12.3/bin/f95 -m64 -PIC -g -c -o solve.QP.compact.o solve.QP.compact.f
      /opt/SunProd/studio12u3/solarisstudio12.3/bin/f95 -m64 -PIC -g -c -o solve.QP.o solve.QP.f
      /opt/SunProd/studio12u3/solarisstudio12.3/bin/f95 -m64 -PIC -g -c util.f -o util.o
      /opt/SunProd/studio12u3/solarisstudio12.3/bin/cc -xc99 -m64 -G -o quadprog.so aind.o init.o solve.QP.compact.o solve.QP.o util.o -xlic_lib=sunperf -lsunmath -lifai -lsunimath -lfai -lfai2 -lfsumai -lfprodai -lfminlai -lfmaxlai -lfminvai -lfmaxvai -lfui -lfsu -lsunmath -lmtsk -lm -lifai -lsunimath -lfai -lfai2 -lfsumai -lfprodai -lfminlai -lfmaxlai -lfminvai -lfmaxvai -lfui -lfsu -lsunmath -lmtsk -lm -L/usr/lib/64/R/lib -lR
      installing to /u01/app/oracle/product/12.1.0/dbhome_1/R/library/quadprog/libs
      ** R
      ** preparing package for lazy loading
      ** help
      *** installing help indices
        converting help for package 'quadprog'
          finding HTML links ... done
          solve.QP                html
          solve.QP.compact        html
      ** building package indices
      ** testing if installed package can be loaded
      * DONE (quadprog)

    Here is an example from http://cran.r-project.org/web/packages/quadprog/quadprog.pdf:

      > require(quadprog)
      > Dmat <- matrix(0,3,3)
      > diag(Dmat) <- 1
      > dvec <- c(0,5,0)
      > Amat <- matrix(c(-4,-3,0,2,1,0,0,-2,1),3,3)
      > bvec <- c(-8,2,0)
      > solve.QP(Dmat,dvec,Amat,bvec=bvec)
      $solution
      [1] 0.4761905 1.0476190 2.0952381
      $value
      [1] -2.380952
      $unconstrained.solution
      [1] 0 5 0
      $iterations
      [1] 3 0
      $Lagrangian
      [1] 0.0000000 0.2380952 2.0952381
      $iact
      [1] 3 2

    Here, the standard example is modified to work with Oracle R Enterprise:

      require(ORE)
      ore.connect("my-name", "my-sid", "my-host", "my-pass", 1521)
      ore.doEval(
        function () {
          require(quadprog)
        }
      )
      ore.doEval(
        function () {
          Dmat <- matrix(0,3,3)
          diag(Dmat) <- 1
          dvec <- c(0,5,0)
          Amat <- matrix(c(-4,-3,0,2,1,0,0,-2,1),3,3)
          bvec <- c(-8,2,0)
          solve.QP(Dmat,dvec,Amat,bvec=bvec)
        }
      )
      $solution
      [1] 0.4761905 1.0476190 2.0952381
      $value
      [1] -2.380952
      $unconstrained.solution
      [1] 0 5 0
      $iterations
      [1] 3 0
      $Lagrangian
      [1] 0.0000000 0.2380952 2.0952381
      $iact
      [1] 3 2

    Now I can combine the quadprog compute algorithms with the Oracle R Enterprise database engine functionality:

      • Scale to large datasets
      • Access tables, views, and external tables in the database, as well as those accessible through database links
      • Use SQL query parallel execution
      • Use in-database statistical and data mining functionality

    Read the article

  • Big GRC: Turning Data into Actionable GRC Intelligence

    - by Jenna Danko
    While it's no longer headline news that governments have carried out large-scale data-mining programmes aimed at terrorism detection and at identifying other patterns of interest across a wide range of digital data sources, the debate over the ethics of and justification for this action will clearly continue for some time to come. What is becoming clear is that these programmes provide a framework for the collation and aggregation of massive amounts of unstructured data and, from this, the creation of actionable intelligence - analyses that allow analysts to explore and extract a variety of patterns and then direct resources. This data included audio and video chats, phone calls, photographs, e-mails, documents, internet searches, social media posts, and mobile phone logs and connections.

    Although Governance, Risk and Compliance (GRC) professionals are not looking to implement such programmes, there are many similar GRC "big data" challenges to be faced, and potential lessons to be learned from these high-profile government programmes that can be applied a lot closer to home. For example, how can GRC professionals collect, manage and analyze an enormous and disparate volume of data to create and manage their own actionable intelligence - covering hidden signs and patterns of criminal activity; the early or retrospective violation of regulations, laws, or corporate policies and procedures; emerging risks; weakening controls; and so on? Not exactly the stuff of James Bond, to be sure, but certainly more applicable to most GRC professionals' day-to-day challenges.

    So what is big data, and how can it benefit the GRC process? Although definitions vary, big data largely refers to the following types of data:

      • Traditional enterprise data - customer information from CRM systems, transactional ERP data, web store transactions, and general ledger data.
      • Machine-generated/sensor data - Call Detail Records ("CDR"), weblogs, and trading systems data.
      • Social data - customer feedback streams, micro-blogging sites like Twitter, and social media platforms like Facebook.

    The McKinsey Global Institute estimates that data volume is growing 40% per year and will grow 44x between 2009 and 2020. But while it's often the most visible parameter, volume is not the only characteristic that matters. In fact, according to sources such as Forrester, there are four key characteristics that define big data:

      • Volume. Machine-generated data is produced in much larger quantities than non-traditional data. This is all the data generated by the IT systems that power the enterprise, including live data from packaged and custom applications - for example, app servers, web servers, databases, networks, virtual machines, telecom equipment, and much more.
      • Velocity. Social media data streams - while not as massive as machine-generated data - produce a large influx of opinions and relationships valuable to customer relationship management, as well as offering early insight into potential reputational risk issues. Even at 140 characters per tweet, the high velocity (or frequency) of Twitter data means large volumes (over 8 TB per day) need to be managed.
      • Variety. Traditional data formats tend to be relatively well defined by a data schema and change slowly. In contrast, non-traditional data formats exhibit a dizzying rate of change.
        Without question, all GRC professionals work in a dynamic environment, and as new services, new products, or new business lines are added, or new marketing campaigns executed, new data types are needed to capture the resultant information.
      • Value. The economic value of data varies significantly. Typically, there is good information hidden amongst the larger body of non-traditional data that GRC professionals can use to add real value to the organisation; the greater challenge is identifying what is valuable and then transforming and extracting that data for analysis and action. For example, customer service calls and emails contain millions of useful data points and have long been a source of information for GRC professionals. Those calls and emails are critical in helping GRC professionals identify hidden patterns and implement new policies that can reduce the volume of customer complaints. Now, on a scale and depth far beyond those in place today, all that unstructured call and email data can be captured, stored and analyzed to reveal the reasons for the contact, perhaps with the aggregated customer results cross-referenced against what is being said about the organization, or a similar peer organization, on social media. The organization can then take positive action: communicating to the market in advance of issues reaching the press, strengthening controls, adjusting risk profiles, changing policies and procedures, and minimizing, if not eliminating, complaints and compensation for that specific reason in the future. In this one example of many similar ones, the GRC team has demonstrated real and tangible business value.

    Big Challenges - Big Opportunities

    As recent Forrester research points out, high-performing companies (those growing 15% or more year-on-year compared to their peers) are taking a selective approach to investing in big data: "Tomorrow's winners understand this, and they are making selective investments aimed at specific opportunities with tangible benefits where big data offers a more economical solution to meet a need." (Forrsights Strategy Spotlight: Business Intelligence and Big Data, Q4 2012.) As pointed out earlier, with the ever-increasing volume of regulatory demands and fines for getting it wrong, limited resource availability, and out-of-date or inadequate GRC systems all contributing to a higher cost of compliance and/or a higher risk profile than desired, a big data investment in GRC clearly falls into this category.

    However, to make the most of big data, organizations must evolve both their business and IT procedures, processes, people and infrastructures to handle these new high-volume, high-velocity, high-variety sources of data and to integrate them with the pre-existing company data to be analyzed. GRC big data clearly gives the organization access to, and management over, a huge amount of often very sensitive information that, although it can help create a more risk-intelligent organization, also presents numerous data governance challenges, including regulatory compliance and information security. In addition to client and regulatory demands for better information security and data protection, the sheer amount of information organizations deal with means that the need to quickly access, classify, protect and manage that information can itself become a key issue, from a legal as well as a technical or operational standpoint.
    However, by making information governance processes a bigger part of everyday operations, organizations can make sure data remains readily available and protected.

    The Right GRC & Big Data Partnership Becomes Key

    The "getting it right first time" mantra used in so many companies remains essential for any GRC team that is sponsoring, helping kick-start, or even overseeing a big data project. To make a big data GRC initiative work and deliver the desired value, partnerships with companies who have a long history of delivering successful GRC solutions, and who are at the very forefront of technology innovation, become key. Clearly, solutions can be built in-house more cheaply than through a vendor, but as has been proven time and time again with self-built solutions covering AML and fraud, for example, few have been able to scale or adapt appropriately to meet the changing regulations and challenges that GRC teams face on a daily basis. This has led to the creation of the GRC silos that are causing so many headaches today. The solutions that stand out, and that should be explored, are the ones that can seamlessly merge the traditional world of well-known data, analytics and visualization with the new world of seemingly innumerable data sources, utilizing big data technologies to generate new GRC insights right across the enterprise. Ultimately, big data is here to stay, and organizations that embrace its potential and outline a viable strategy, as well as understand and build a solid analytical foundation, will be the ones well positioned to make the most of it.

    A Blueprint and Roadmap Service for Big Data

    Big data adoption is first and foremost a business decision. As such, it is essential that your partner can align your strategies, goals, and objectives with an architecture vision and roadmap to accelerate adoption of big data for your environment, as well as establish practical, effective governance that will maintain a well-managed environment going forward.

    Key activities: while your initiatives will clearly vary, there are some generic starting points the team and organization will need to complete:

      • Clearly define your drivers, strategies, goals, objectives and requirements as they relate to big data
      • Conduct a big data readiness and information architecture maturity assessment
      • Develop a future-state big data architecture, including views across all relevant architecture domains: business, applications, information, and technology
      • Provide initial guidance on big data candidate selection for migrations or implementations
      • Develop a strategic roadmap and implementation plan that prioritizes initiatives based on business impact and technology dependency, with an incremental integration approach for evolving your current state to the target future state in a manner that carries the least risk and impact of change on the business
      • Provide recommendations for practical, effective data governance, data quality management, and information lifecycle management to maintain a well-managed environment
      • Conduct an executive workshop with recommendations and next steps

    There is little debate that managing risk and managing data are the two biggest obstacles encountered by financial institutions.
    Big data is here to stay, and risk management certainly is not going anywhere; ultimately, the financial services organizations that embrace big data's potential, outline a viable strategy, and understand and build a solid analytical foundation will be best positioned to make the most of it. Matthew Long is a Financial Crime Specialist for Oracle Financial Services. He can be reached at matthew.long AT oracle.com.

    Read the article

  • Ctrl+Click / Command+Click not working with analytics

    - by user347998
    Hi all, I created my own analytics for my site to track outbound click events using jQuery. The thing with preventDefault() is that it does not allow the Ctrl+Click or Command+Click browser behavior of opening the link in a new tab/window. So my solution was to detect e.metaKey || e.ctrlKey and use window.open, but that does not work very well with Safari unless the user changes browser behavior. I am wondering if anyone here knows what other analytics users do - how do Google and others deal with this problem when tracking outbound links? From this link: http://www.google.com/support/googleanalytics/bin/answer.py?hl=en&answer=55527 - it looks like Google faces the same problem. Thoughts?
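    One common pattern (an illustrative sketch, not from the post; trackOutbound, the beacon URL, and the delay value are assumptions, and jQuery 1.7+'s .on is assumed): leave modified clicks alone so the browser's native new-tab behavior still works, and only intercept plain left-clicks, delaying navigation just long enough for the beacon to fire:

      $(document).on('click', 'a', function (e) {
        var href = this.href;
        // Hypothetical helper: fires a tracking beacon via an image request.
        function trackOutbound(url) {
          new Image().src = '/t.gif?out=' + encodeURIComponent(url);
        }
        // Modified clicks (Ctrl/Cmd/Shift or middle button) open tabs/windows
        // natively; record them but do NOT preventDefault.
        if (e.metaKey || e.ctrlKey || e.shiftKey || e.which === 2) {
          trackOutbound(href);
          return;
        }
        // Plain left-click: hold navigation briefly so the beacon has time to leave.
        e.preventDefault();
        trackOutbound(href);
        setTimeout(function () { window.location.href = href; }, 150);
      });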

    Read the article

  • Google Analytics-style custom report builder UI

    - by gregmac
    I'm looking for a reporting engine/UI that can be integrated into a product, with a UI along the lines of Google Analytics' custom report builder. Is anyone aware of such a thing? The data in our case is not page views/visitors/etc., but it is similar in nature: there are limited entities or types of data, but each entity has many attributes/columns and many different ways of aggregating data (in GA-speak, metrics and dimensions). The Analytics-style UI is very intuitive and allows many reports to be created in powerful ways without having to know SQL. I have a preference for a web-based tool (seeing that it is 2010 and this is a web app - I mention it only because the vast majority of reporting tools still seem to offer only a non-web-based creation tool).

    Read the article

  • Blacklist test requests from Google Analytics using Watir

    - by Anjali
    Hi, I have to automate tests for a web application that runs a Google Analytics script. I have chosen Watir for the automation since I can script all the test cases with it. The only problem is I don't know how to keep my test requests to the web app out of the Google Analytics report. Can anyone help me with this? Is it possible to do that with Watir? If not Watir, is there any other web automation tool I could use? ~Thanks and Regards
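    One approach worth trying (a sketch, not from the post; the property ID and cookie name are placeholders): have the test harness set a marker the page can check, and set Google Analytics' documented per-property opt-out flag before ga.js runs, so test page views are never sent:

      // In the page template, before the GA snippet loads:
      if (document.cookie.indexOf('automated_test=1') !== -1) {
        // GA's opt-out flag; UA-XXXXX-Y stands in for the real property ID.
        window['ga-disable-UA-XXXXX-Y'] = true;
      }

    The Watir script would then set the automated_test cookie (or append a query parameter the template checks) before exercising the app; alternatively, the test browser profile can simply block google-analytics.com.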

    Read the article

  • Firefox: how to block cookies by name, not by site?

    - by deepc
    Firefox allows you to block all cookies on a site-by-site level, which is OK for the most part. However, it does not help with blocking only Google Analytics cookies, whose names start with __ut. Is there a Firefox add-on that can block all __ut* cookies? I know there are many cookie-related add-ons for Firefox - but apparently all of them simply fine-tune site-by-site cookie blocking, according to their descriptions. Hopefully I missed the one that can do this... I also know about Google's plugin to opt out of Analytics, but installing a specific plug-in for that purpose (as opposed to an add-on) seems a bit overdone. Plus, I would have to trust Google with that, which is exactly what I don't want to do.
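    Failing a suitable add-on, a Greasemonkey-style userscript could simply expire the classic GA cookies on every page load - a sketch (the cookie list is the standard ga.js set; the domain handling is simplified):

      // Expire Google Analytics' __ut* cookies for the current site on each page load.
      ['__utma', '__utmb', '__utmc', '__utmz', '__utmv'].forEach(function (name) {
        document.cookie = name + '=; expires=Thu, 01 Jan 1970 00:00:00 GMT; path=/; domain=.' +
          location.hostname.replace(/^www\./, '');
      });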

    Read the article

  • Google and Adobe move into social analytics, offering to quantify the returns of marketing campaigns on social networks

    Google and Adobe move into social analytics, and offer to concretely evaluate the return on investment of marketing campaigns on social networks. How much does a marketing campaign or operation on social networks bring in? Evaluating that ROI is one of today's challenges for web marketers. Managers and decision-makers increasingly want to integrate Facebook or Twitter into their communication strategies, but they also want to know what it earns them. Google has understood this well and should offer new indicators in Google Analytics in the coming weeks. In short, Analytics will be able to trace incoming visits...

    Read the article

  • 600 visitors per day, 20 backlinks, but still not indexed by Google

    - by Tristan
    Hello, I launched a website on Wednesday (9 August). In 4 days there have already been 12,000 page views and 1,400 visits, with 20 to 25 backlinks, a sitemap.xml (130 URLs), and English/French versions at URLs like "/en/" and "/fr/". BUT I'm still not indexed by Google. In Google Webmaster Tools I have 0 backlinks, 130 URLs in the sitemap, and 0 URLs indexed on Google. For a smaller website, I remember it took me less time to appear on Google with fewer visits. My URL is http://www.seek-team.com/en/ for English; just replace /en/ with /fr/ to access it in French. What's causing this? Is there an explanation? Thanks for your help. (PS: I've already checked robots.txt.)

    Read the article

  • What is the maximum number of characters in the utm_content param in GA?

    - by David Parks
    For example, we want to differentiate people who followed our daily product email blast. I could use the product ID for utm_content, but it would be easier to read to use the SEO-friendly URL path, such as: http://www.oursite.com/products/really-great-new-product

      https://www.frugg.com/?utm_source=a&utm_medium=b&utm_term=c&utm_content=Can-I-use-a-really-long-content-tag-like-this-one-or-is-this-going-to-break-something&utm_campaign=d

    Read the article

  • Analytics: Can I set a goal on multiple events?

    - by David Parks
    We have a popup dialogue that requests the user's email address or Facebook login. The page behind the popup loads, so a page view is counted. We want to measure: how many users ignored the popup completely; how many users engaged the popup but didn't complete the process (we trigger an event when the user performs actions defined as "engaging"); and how many users completed the popup. Bounce rates aren't telling, because some users won't receive the popup. We are basically triggering the events "PopupDisplayed", "PopupEngaged" and "PopupComplete", with labels to differentiate between email and Facebook. But I don't think I can set up goals that count "users who fired both 'PopupDisplayed' AND 'PopupComplete'", which is what I'd need to know how many users both saw the popup and completed it.
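    For reference, a sketch of how the three stages described might be fired with classic ga.js events (the category and label values are illustrative):

      // Fired when the popup renders, when the user first interacts, and on completion.
      _gaq.push(['_trackEvent', 'Popup', 'PopupDisplayed', 'email']);
      _gaq.push(['_trackEvent', 'Popup', 'PopupEngaged',   'email']);
      _gaq.push(['_trackEvent', 'Popup', 'PopupComplete',  'facebook']);

    One observation (not from the post): since a user can only complete a popup they were shown, every PopupComplete implies a PopupDisplayed, so the unique-event count for PopupComplete is itself the "saw AND completed" intersection.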

    Read the article

  • How can I parse Amazon S3 log files?

    - by artlung
    What are the best options for parsing Amazon S3 (Simple Storage Service) log files? I've turned on logging, and now I have log files that look like this: 858e709ba90996df37d6f5152650086acb6db14a67d9aaae7a0f3620fdefb88f files.example.com [08/Jul/2010:10:31:42 +0000] 68.114.21.105 65a011a29cdf8ec533ec3d1ccaae921c 13880FBC9839395C REST.GET.OBJECT example.com/blog/wp-content/uploads/2006/10/kitties_we_cant_stop_here_this_is_bat_country.jpg "GET /example.com/blog/wp-content/uploads/2006/10/kitties_we_cant_stop_here_this_is_bat_country.jpg HTTP/1.1" 200 - 32957 32957 12 10 "http://atlanta.craigslist.org/forums/?act=Q&ID=163218891" "Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.0.19) Gecko/2010031422 Firefox/3.0.19" - What are the best options for automating analysis of these log files? I'm not using any Amazon services other than S3.
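    For illustration (not from the post): the S3 server access log is a space-delimited format with a bracketed timestamp and quoted request/referrer/user-agent fields, so a single regex goes a long way. A minimal Node.js sketch, assuming the field layout shown in the sample above:

      // Parse one S3 access-log line into the fields of interest.
      var re = /^(\S+) (\S+) \[([^\]]+)\] (\S+) (\S+) (\S+) (\S+) (\S+) "([^"]*)" (\S+) (\S+) (\S+) (\S+) (\S+) (\S+) "([^"]*)" "([^"]*)"/;

      function parseS3Line(line) {
        var m = re.exec(line);
        if (!m) return null;
        return {
          bucket:    m[2],   // e.g. files.example.com
          time:      m[3],   // e.g. 08/Jul/2010:10:31:42 +0000
          remoteIp:  m[4],
          operation: m[7],   // e.g. REST.GET.OBJECT
          key:       m[8],
          status:    m[10],  // HTTP status code
          bytesSent: m[12],
          referrer:  m[16],
          userAgent: m[17]
        };
      }

      // Usage sketch: stream a log file line by line and aggregate as needed.
      // require('readline')
      //   .createInterface({ input: require('fs').createReadStream('s3.log') })
      //   .on('line', function (line) { console.log(parseS3Line(line)); });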

    Read the article

  • Apache log analyzer which manages time spent to serve the request

    - by antispam
    I need to monitor performance on my web server (there's an application server behind it) and create reports for senior management. I've enabled %T/%D in my Apache logs, and I would like to know if there's an Apache log analyzer or some other tool that parses these values and turns them into charts or reports. I am looking mostly for an integrated solution, not something along the lines of awk + gnuplot scripts.
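    For context, a sketch of the relevant configuration (not from the post; the format name is arbitrary): %T and %D are standard mod_log_config directives, appended here to a combined-style format so analyzers can pick up the timing fields:

      # httpd.conf - combined log format extended with request service time.
      # %T = whole seconds to serve the request, %D = microseconds (Apache 2.x).
      LogFormat "%h %l %u %t \"%r\" %>s %b \"%{Referer}i\" \"%{User-Agent}i\" %T/%D" combined_timing
      CustomLog logs/access_log combined_timing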

    Read the article

  • GA UA codes for testing site - set up

    - by Drew
    Does anyone know the process for using a live GA UA code to test a site in development? That is, I have a live site with a GA UA code attached, tracking live traffic data, e.g. UA-123456. I've been told that there is a way to produce another code, associated with the primary code, to use on the testing version of my live site, e.g. the test code could be UA-123457. Can anyone shed some light on this? If it's not possible, should I just set up a completely separate GA account for my testing site?
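    One common arrangement (illustrative only; the IDs and hostname are placeholders): create a second property (or profile) under the same account and pick it at runtime, so staging traffic never lands in the live reports:

      // Choose the GA property by hostname so the test site reports separately.
      var account = (location.hostname === 'www.example.com')
        ? 'UA-123456-1'   // live property
        : 'UA-123456-2';  // test property under the same account
      var _gaq = _gaq || [];
      _gaq.push(['_setAccount', account]);
      _gaq.push(['_trackPageview']);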

    Read the article

  • Tracking unique views for a site showing my advertisements [on hold]

    - by user580950
    I am in trouble. I placed an advertisement on a website in 2012. The website said they got 950,000 unique visits each month, and early in 2012 I advertised with them. The advertisement didn't work out; when I checked 2-3 months later, I saw that the site's unique visitors were 8,000 at that time, and I immediately closed the account. I don't remember which site I used to check the unique visitors. The advertising company has now filed a dispute against me. Is there any tool that can show me the 2012 stats for any website? I tried Google Trends, but it doesn't show such statistics.

    Read the article

  • Where can I find comscore rank?

    - by Joyce Babu
    Recently an ad network rejected my registration, stating that my site doesn't meet their minimum monthly impressions, even though the site serves three times the required page views. When I contacted them for details, their representative hinted that they are using comScore data for screening submissions. Where can I view my site's comScore ranking and details? Update: I was able to find the traffic figures by tagging my site with comScore Direct.

    Read the article

  • Can I remove visible referer from link?

    - by Andreas
    I use referrer info to track which of my campaigns works best. So instead of <a href="someweb.com">someweb</a> I have a link like <a href="http://someweb.com?utm_source=john&utm_medium=email&utm_content=NAME&utm_campaign=campaign">someweb</a>. Now when a user clicks "someweb", the whole URL string is shown in the address bar. Is it possible to mask or hide this somehow? Maybe via .htaccess? Thanks in advance.
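    One client-side option (a sketch, not from the post; it requires a browser with HTML5 history support, and the analytics snippet must record the page view before this runs): strip the utm_* parameters from the visible URL with history.replaceState:

      // After the tracking code has recorded the landing URL, clean the address bar.
      if (window.history && history.replaceState && /[?&]utm_/.test(location.search)) {
        var params = location.search.slice(1).split('&').filter(function (p) {
          return p.indexOf('utm_') !== 0;  // keep every non-campaign parameter
        });
        var clean = location.pathname + (params.length ? '?' + params.join('&') : '') + location.hash;
        history.replaceState(null, document.title, clean);  // no reload, no new history entry
      }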

    Read the article

  • The Virtues and Challenges of Implementing Basel III: What Every CFO and CRO Needs To Know

    - by Jenna Danko
    The Basel Committee on Banking Supervision (BCBS) is a group tasked with providing thought leadership to the global banking industry. Over the years, the BCBS has released volumes of guidance in an effort to promote stability within the financial sector. By effectively communicating best practices, the Basel Committee has influenced financial regulations worldwide. Basel regulations are intended to help banks:

      • More easily absorb shocks due to various forms of financial and economic stress
      • Improve risk management and governance
      • Enhance regulatory reporting and transparency

    In June 2011, the BCBS released Basel III: A global regulatory framework for more resilient banks and banking systems. This new set of regulations included many enhancements to previous rules and will have both short- and long-term impacts on the banking industry. Some of the key features of Basel III include:

      • A stronger capital base: more stringent capital standards, higher capital requirements, and the introduction of capital buffers
      • Additional risk coverage: enhanced quantification of counterparty credit risk, credit valuation adjustments, wrong-way risk, and an asset value correlation multiplier for large financial institutions
      • Liquidity management and monitoring, and the introduction of a leverage ratio
      • Even more rigorous data requirements

    To implement these features, banks need to embark on a journey replete with challenges. These can be categorized into three key areas: data, models and compliance.

    Data challenges:

      • Data quality - all standard dimensions of data quality (DQ) have to be demonstrated. Manual approaches are now considered too cumbersome, and automation has become the norm.
      • Data lineage - lineage has to be documented and demonstrated. The PPT/Excel approach to documentation is being replaced by metadata tools. Data lineage has become dynamic due to a variety of factors, making static documentation quickly outdated.
      • Data dictionaries - a strong and clean business glossary is needed, with proper identification of business owners for the data.
      • Data integrity - a strong, scalable architecture with workflow tools helps demonstrate data integrity. Manual touch points have to be minimized.
      • Data relevance/coverage - data must be relevant to all portfolios, and storage must allow for sufficient data retention. Coverage of both on- and off-balance-sheet exposures is critical.

    Model challenges:

      • Model development - requires highly trained resources with both quantitative and subject matter expertise.
      • Model validation - all Basel models need to be validated. This requires additional resources with skills that may not be readily available in the marketplace.
      • Model documentation - all models need to be adequately documented. Creation of document templates and model development processes/procedures is key.
      • Risk and finance integration - this integration is necessary for Basel, as the Allowance for Loan and Lease Losses (ALLL) is calculated by Finance, yet Expected Loss (EL) is calculated by Risk Management - and the two need to somehow be equal. This is tricky at best from an implementation perspective.

    Compliance challenges:

      • Rules interpretation - some Basel III requirements leave room for interpretation. A misinterpretation of regulations can lead to delays in Basel compliance and undesired reprimands from supervisory authorities.
      • Gap identification and remediation - internal identification and remediation of gaps ensures smoother Basel compliance and audit processes.
        However, business lines are challenged by the competing priorities that arise from regulatory compliance and business-as-usual work.
      • Qualification readiness - providing internal and external auditors with robust evidence of a thorough examination of readiness to proceed to parallel run and Basel qualification.

    In light of new regulations like Basel III, and local variations such as the Dodd-Frank Act (DFA) and the Comprehensive Capital Analysis and Review (CCAR) in the US, banks are now forced to ask themselves many difficult questions. For example, executives must consider:

      • How will Basel III play into their risk appetite?
      • How will they create project plans for Basel III when they haven't yet finished implementing Basel II?
      • How will new regulations impact capital structure, including profitability and capital distributions to shareholders?

    After all, new regulations often lead to diminished profitability, as well as an assortment of implementation problems, as we discussed earlier in this note. However, by requiring banks to focus on premium growth, regulators increase the potential for long-term profitability and sustainability. And a more stable banking system:

      • Increases consumer confidence, which in turn supports banking activity
      • Ensures that adequate funding is available for individuals and companies
      • Puts regulators at ease, allowing bankers to focus on banking

    Stability is intended to bring long-term profitability to banks. Therefore, it is important that every banking institution takes the steps necessary to properly manage, monitor and disclose its risks, which can be done with the assistance and oversight of an independent regulatory authority. A spectrum of banks exists today: some continue to debate and negotiate with regulators over the implementation of the new requirements, while others are simply choosing to embrace them for the benefits highlighted above. Do share with me how your institution is coping with and embracing these new regulations.

    Dr. Varun Agarwal is a Principal in the Banking Practice for Capgemini Financial Services. He has over 19 years of experience in areas spanning enterprise risk management; credit, market, and country risk management; financial modeling and valuation; and international financial markets research and analysis.

    Read the article
