Search Results

Search found 30345 results on 1214 pages for 'website analytics tools'.


  • .NET analysis tools [closed]

    - by TWith2Sugars
    Possible Duplicate: What static analysis tools are available for C#? At work we tend to use two tools for analysing our projects: FxCop to analyse our managed code and StyleCop to keep a consistent code layout. I found these tools pretty much by accident, and it has led me to wonder what other tools are available that I might have missed.

    Read the article

  • In Google Analytics, what is 'ga:accountName' for?

    - by Chez
    In Google Analytics, what is 'ga:accountName' for? It might seem like a straightforward question, but I can't find any documentation that tells me what ga:accountName is supposed to return. If I run Google's code from the Java example:

        private static void getAccountFeed(AnalyticsService analyticsService)
                throws IOException, MalformedURLException, ServiceException {
            // Construct query from a string.
            URL queryUrl = new URL(
                "https://www.google.com/analytics/feeds/accounts/default?max-results=10");
            // Make request to the API.
            AccountFeed accountFeed = analyticsService.getFeed(queryUrl, AccountFeed.class);
            // Output the data to the screen.
            System.out.println("-------- Account Feed Results --------");
            for (AccountEntry entry : accountFeed.getEntries()) {
                System.out.println(
                    "\nAccount Name = " + entry.getProperty("ga:accountName") +
                    "\nProfile Name = " + entry.getTitle().getPlainText() +
                    "\nProfile Id = " + entry.getProperty("ga:profileId") +
                    "\nTable Id = " + entry.getTableId().getValue());
            }
        }

    it does return my website. Can anybody help? Thanks.

    Read the article

  • Use Google Analytics custom events for a feedback form

    - by chacko
    I was thinking of having a simple feedback form on my website. It would be something like: "Your feedback will help us improve." [ ] and then a textfield/textarea where the user can type (let's say) up to 100 characters of feedback. Rather than handling it all myself on the server side, I was thinking of using Google Analytics (since my site is already wired up) and, every time a user writes a comment, sending a custom event to Google Analytics. I think it might work. Can people suggest a better approach or point out any problems with this idea? Thanks.
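
    A minimal sketch of what the event call might look like with the classic ga.js async syntax, assuming a hypothetical form with id "feedback-form" and a textarea with id "feedback" (the ids and the category/action names are made up):

        // Send the feedback text as the event label, truncated to 100 characters.
        document.getElementById('feedback-form').onsubmit = function () {
            var feedback = document.getElementById('feedback').value.substring(0, 100);
            _gaq.push(['_trackEvent', 'Feedback', 'submit', feedback]);
            return false; // this sketch suppresses a normal form post
        };

    One caveat: event labels are reporting dimensions rather than a general-purpose data store, so free-form text will produce one label row per distinct comment.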

    Read the article

  • Google Analytics: event tracking without affecting bounce rate

    - by cmancre
    Hi, I'm studying a way of using Google Analytics to track ad impressions/clicks. It looks like event tracking is the way to go. Tracking clicks is easy to implement. My doubt concerns impressions: using event tracking on page load will cut my bounce rate down to 0, and using a second profile doesn't look elegant (I'd leave that as a last resort). GOAL: John loads page A and leaves. Count 1 for impressions and 1 for bounce rate, as it should. Is there a way of doing this with Google Analytics?
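
    For what it's worth, the classic _trackEvent call accepts an optional fifth "non-interaction" argument; when it is true, the event does not affect bounce rate. A minimal sketch with ga.js async syntax and a hypothetical ad id:

        // Impression: fired on page load, flagged as non-interaction so a
        // single-page visit still counts as a bounce.
        _gaq.push(['_trackEvent', 'Ads', 'impression', 'ad-123', 0, true]);

        // Click: a genuine interaction, so the flag stays at its default (false).
        _gaq.push(['_trackEvent', 'Ads', 'click', 'ad-123']);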

    Read the article

  • How does Google Analytics save data to a database

    - by Pranz
    Hello everyone, I am making a Google Analytics-like project for a school assignment. I have two main questions: 1) Exactly when does Google store the data in the database? Does it use XHR with some server-side scripting language to store it, or is there a way to do it using plain JavaScript? 2) How do I get the IP address of a user from JavaScript? How does Google do it for Analytics? Thanks for all the help. Pranz
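
    Not Google's actual implementation, but a rough sketch of the common pattern: the tracking script requests a tiny image from a collection server with the data in the query string (so no XHR is needed), and the IP address is read server-side from the incoming HTTP request rather than from JavaScript. The collector URL and parameter names below are made up:

        // Hypothetical collector endpoint; classic GA requests __utm.gif from its own servers.
        function recordPageview() {
            var params = [
                'page=' + encodeURIComponent(location.pathname),
                'ref=' + encodeURIComponent(document.referrer),
                'res=' + screen.width + 'x' + screen.height,
                'rnd=' + Math.random()  // cache buster
            ].join('&');
            var beacon = new Image(1, 1);
            beacon.src = 'http://collector.example.com/collect.gif?' + params;
        }
        recordPageview();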

    Read the article

  • Download all the links from a website at once [closed]

    - by user216112
    Possible Duplicate: How can I download an entire website Is there any software that lets you download all the links of a website at once? E.g. I'm using the w3school.com site and want to download all the PHP tutorials at once. Someone told me "tglepote", but I have no idea what it is, and Google returns nothing.

    Read the article

  • How to register RSS for a website?

    - by domainking
    I am not sure if I am asking this question in the right place, because I am new to this. What I want to ask is: do I need to register/create an RSS feed for my website? I have a website, let's say [http://blog.domain.com], which is a WordPress 2.9.2 blog. So, if I want to display the latest content on another subdomain, for example [news.domain.com], how do I do that? I know a little bit of PHP and MySQL.

    Read the article

  • Management and Monitoring Tools for Windows Azure

    - by BuckWoody
    With such a large platform, Windows Azure has a lot of moving parts. We’ve done our best to keep the interface as simple as possible, while giving you the most control and visibility we can. However, as with most Microsoft products, there are multiple ways to do something – and I’ve always found that to be a good strength. Depending on the situation, I might want a graphical interface, a command-line interface, or just an API so I can incorporate the management into my own tools, or have third-party companies write other tools. While by no means exhaustive, I thought I might put together a quick list of a few tools you can use to manage and monitor Windows Azure components, from our IaaS, SaaS and PaaS offerings. Some of the products focus on one area more than another, but all are available today. I’ll try and maintain this list to keep it current, but make sure you check the date of this post’s update – if it’s more than six months old, it’s most likely out of date. Things move fast in the cloud.

    The Windows Azure Management Portal
    The primary tool for managing Windows Azure is our portal – most everything you need is there, from creating new services to querying a database. There are two versions as of this writing – a Silverlight client version, and a newer HTML5 version. The latter is being updated constantly to be in parity with the Silverlight client. There’s a balance in this portal between simplicity and power – we’re following the “less is more” approach, with increasing levels of detail as you work through the portal rather than overwhelming you with a single, long “more is more” page. You can find the Portal here: http://windowsazure.com (then click “Log In” and then “Portal”)

    Windows Azure Management API
    You can also use programming tools to either write your own interface, or simply provide management functions directly within your solution. You have two options – you can use the more universal REST APIs, which are a bit more complex but work with any system that can write to them, or the more approachable .NET API calls in code.
    You can find the reference for the APIs here: http://msdn.microsoft.com/en-us/library/windowsazure/ee460799.aspx
    All Class Libraries, for each part of Windows Azure: http://msdn.microsoft.com/en-us/library/ee393295.aspx

    PowerShell Command-lets
    PowerShell is one of the most powerful scripting languages I’ve used with Windows – and it’s baked into all of our products. When you need to work with multiple servers, scripting is really the only way to go, and the Windows Azure PowerShell Command-Lets allow you to work across most any part of the platform – and can even be used within the services themselves. You can do everything with them from creating a new IaaS, PaaS or SaaS service, to controlling them and even working with security and more.
    You can find more about the Command-Lets here: http://wappowershell.codeplex.com/documentation (older link, still works, will point you to the new ones as well)
    We have command-line utilities for other operating systems as well: https://www.windowsazure.com/en-us/manage/downloads/
    Video walkthrough of using the Command-Lets: http://channel9.msdn.com/Events/BUILD/BUILD2011/SAC-859T

    System Center
    System Center is actually a suite of graphical tools you can use to manage, deploy, control, monitor and tune software from Microsoft and even other platforms. This will be the primary tool we’ll recommend for managing a hybrid or contiguous management process – and as time goes on you’ll see more and more features put into System Center for the entire Windows Azure suite of products.
    You can find the Management Pack and README for it here: http://www.microsoft.com/en-us/download/details.aspx?id=11324

    SQL Server Management Studio / Data Tools / Visual Studio
    SQL Server has two built-in management and development tools, and since version 2008 R2 you can use them to manage Windows Azure Databases. Visual Studio also lets you connect to and manage portions of Windows Azure as well as Windows Azure Databases.
    You can read more about Visual Studio here: http://msdn.microsoft.com/en-us/library/windowsazure/ee405484
    You can read more about the SQL tools here: http://msdn.microsoft.com/en-us/library/windowsazure/ee621784.aspx

    Vendor-Provided Tools
    Microsoft does not suggest or endorse a specific third-party product. We do, however, use them, and see lots of other customers use them. You can browse to these sites to learn more, and chat with their folks directly on how they support Windows Azure.
    Cerebrata: Tools for managing from the command line, graphical diagnostics, graphical storage management - http://www.cerebrata.com/
    Quest Cloud Tools: Monitoring, storage management, and costing tools - http://communities.quest.com/community/cloud-tools
    Paraleap: Monitoring tool - http://www.paraleap.com/AzureWatch
    Cloudgraphs: Monitoring tool - http://www.cloudgraphs.com/
    Opstera: Monitoring for Windows Azure and a scale-out pattern manager - http://www.opstera.com/products/Azureops/
    Compuware: SaaS performance monitoring, load testing - http://www.compuware.com/application-performance-management/gomez-apm-products.html
    SOASTA: Penetration and security testing - http://www.soasta.com/cloudtest/enterprise/
    LoadStorm: Load-testing tool - http://loadstorm.com/windows-azure

    Open-Source Tools
    This is probably the most specific set of tools, and the list I’ll have to maintain most often. Smaller projects have a way of coming and going, so I’ll try and make sure this list is current.
    Windows Azure MMC (I actually use this one a lot): http://wapmmc.codeplex.com/
    Windows Azure Diagnostics Monitor: http://archive.msdn.microsoft.com/wazdmon
    Azure Application Monitor: http://azuremonitor.codeplex.com/
    Azure Web Log: http://www.xentrik.net/software/azure_web_log.html
    Cloud Ninja: Multi-tenant billing and performance monitor - http://cnmb.codeplex.com/
    Cloud Samurai: Multi-tenant management - http://cloudsamurai.codeplex.com/

    If you have additions to this list, please post them as a comment and I’ll research and then add them. Thanks!

    Read the article

  • Google Analytics custom variables and how are they recorded?

    - by mrtsherman
    I have been asked to add GA custom variable tracking to my company's website. The company website uses server-side includes, so modifications to the tracking code happen identically everywhere; maintenance is therefore a headache. Also, GA takes about twenty-four hours for custom variables to start showing up in reports, and that makes troubleshooting a headache. So say you have custom variables like these:

        // visitor level tracking, id = 12345
        _gaq.push(['_setCustomVar', 1, 'id', '12345', 1]);
        // page level tracking, email = [email protected]
        _gaq.push(['_setCustomVar', 1, 'email', '[email protected]', 1]);

    The marketing people want the following out of this: a user visits the site and we record a unique id for them; whenever they return, this id is used in GA. A user signs up for our newsletter on page X and we record their email address; whenever they return, this email address is used in GA. Now, a big problem for me is that I don't use GA and the marketing people don't use custom variables, so we don't actually know how this will work. Do I want page, session or visitor level tracking? What happens because the same GA code is used on every page? If they visit the email sign-up form and we record the email address, but then they go somewhere else where the email is nonexistent, will the value get 'overwritten'? Sorry for the long question, but there are a lot of unknowns for a GA noob.
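
    For reference, _setCustomVar takes an optional scope argument (1 = visitor, 2 = session, 3 = page), and each slot index (1-5) holds only one variable at a time, so a visitor-level id and an email would normally live in different slots. A hedged sketch (the slot numbers and values below are illustrative, not from the question):

        // Slot 1, visitor scope: persists across this browser's visits.
        _gaq.push(['_setCustomVar', 1, 'id', '12345', 1]);

        // Slot 2, visitor scope: set on the newsletter sign-up page and then
        // reported on later visits as well; use scope 3 if it should only
        // apply to that single page.
        _gaq.push(['_setCustomVar', 2, 'email', 'user@example.com', 1]);

        // Custom variables are sent with the next pageview or event.
        _gaq.push(['_trackPageview']);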

    Read the article

  • What sort of attack URL is this?

    - by Asker
    I set up a website with my own custom PHP code. It appears that people from places like Ukraine are trying to hack it. They're trying a bunch of odd accesses, seemingly to detect what PHP files I've got. They've discovered that I have PHP files called mail.php and sendmail.php, for instance. They've tried a bunch of GET options like: http://mydomain.com/index.php?do=/user/register/ http://mydomain.com/index.php?app=core&module=global§ion=login http://mydomain.com/index.php?act=Login&CODE=00 I suppose these all pertain to something like LiveJournal? Here's what's odd, and the subject of my question. They're trying this URL: http://mydomain.com?3e3ea140 What kind of website is vulnerable to a 32-bit hex number?

    Read the article

  • Ways to go about optimizing website performance: WordPress, Amazon EC2, Apache and RDS MySQL

    - by fuzzybee
    I have 6 WordPress websites running on 1 single EC2 instance. All the websites connect to databases in 1 RDS instance. Earlier today, traffic to the largest website peaked and the RDS instance became a bottleneck: CPU utilization was 100% for over an hour. It affected all of my websites, as it took them all forever to load. In order to prevent such an issue from happening again, which of the following will matter most, so that I invest time and effort in it first? (I will work on all of them later; I just need to prioritise now.) To improve caching for all websites. To fine-tune the database server. To fine-tune my Apache server. What will be the effect on user experience for my websites? Some quick searches show that I should limit the number of concurrent connections to my web server, but wouldn't that prevent users from accessing my websites? More background: my largest website has 140k visits and 660k page views a month; the other 5 websites add up to much less than that. I'm using a large EC2 instance as the web server and a medium RDS instance as the database server. What I've already done: use the W3 Total Cache plugin for most of the websites, especially the largest one (there is barely anything else in terms of caching I could do for the largest website). Am I using my resources wastefully, or are there simply not enough resources for my websites? Or rather, how do I answer that question myself?

    Read the article

  • Software to automate website screenshot capture

    - by Leniel Macaferi
    Do you know of any software that can automate the process of getting screenshots of every page of a website? It would act like a spider/crawler/robot. You name it... For example: I developed a website and now I'd like to get a screenshot of every page of the site. I could of course do it manually (a lot of work). For each module of the site (Student, Payment, etc.) I have different pages/forms (Create, Edit, Details, Delete, etc.). What I'm looking for is software that can visit every link of the site and then capture the screen, automating the whole process. It would also be good if the software allowed the user to pass in a list of URLs to capture, allowing even more fine-grained configuration. EDIT: I tried Selenium, mentioned by Aaron in his answer, but I managed to find an app that does exactly what I needed. It's called Paparazzi!. I wrote a blog post to showcase my attempt at Selenium and my findings regarding Paparazzi!'s batch capture functionality: Software to automate website screenshot capture
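
    For anyone who prefers to script it rather than use a packaged tool, a rough sketch of the Selenium route in Node.js with the selenium-webdriver package is below; the URL list is a placeholder, and it assumes a local Chrome plus chromedriver are installed:

        // npm install selenium-webdriver
        const fs = require('fs');
        const { Builder } = require('selenium-webdriver');

        const urls = [                      // placeholder list of pages to capture
          'http://localhost/student/create',
          'http://localhost/payment/edit/1'
        ];

        (async () => {
          const driver = await new Builder().forBrowser('chrome').build();
          try {
            for (const url of urls) {
              await driver.get(url);
              const png = await driver.takeScreenshot();   // base64-encoded PNG
              const name = url.replace(/[^a-z0-9]+/gi, '_') + '.png';
              fs.writeFileSync(name, png, 'base64');
            }
          } finally {
            await driver.quit();
          }
        })();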

    Read the article

  • Tools required for a web development project

    - by RBA
    Hi, I want to build a project on Linux that uses several programming languages (C, Perl, PHP, HTML, XML, etc.), basically a web-based project. The reason I have chosen to build on Linux is that it is open source, and a lot of things can be automated through scripting languages, which I don't know how to do on Windows. So I have installed Linux on a virtual machine (host: Windows 2007, guest: Linux CentOS), with CentOS running as a command-line interface. Since I am a beginner, I want to know which tools can facilitate and ease my development process. Some that I know are listed below; please share your experience on this. 1) Using PuTTY so that I can access the Linux machine from anywhere within the network. 2) Since I want to develop on Linux but use Windows as the development platform, I have downloaded the Eclipse editor (C/PHP) on Windows. But how can I access Linux files from there? 3) I have installed Samba and am still trying to figure out how I can access Linux files remotely on Windows. 4) Please share your experience on how I can ease my development process and what other tools I can use. Please let me know if you need any other clarification.

    Read the article

  • Is PayPal the best solution for a payment gateway for a website?

    - by Pennf0lio
    I have a realty website that needs a payment gateway for property reservations. The reservation fee ranges from $500-$600, for about 5-6 people per month. I was wondering if PayPal is the best solution for accepting payment. What would be the pros and cons of using PayPal? PayPal was my first choice because it's easy to integrate into my existing website and I wouldn't have to worry so much about security. P.S. It's not part of the question, but if you can cite some realty websites that accept payments and would be good inspiration, it would be highly appreciated. Thanks!

    Read the article

  • What data to send when tracking clicks with Google Analytics events (and how)?

    - by user359650
    When tracking clicks on links, there are 3 items I'm interested in: the link's location in the page, by grabbing the id of the closest parent (to see the influence of location on click-through); the link text (to see the influence of text on click-through); and the link's href attribute value (to see where people go when leaving my website). The problem when using Google Analytics to track those clicks is that events only have 3 available text fields, one of which is the category; if you use the category to store one of the above items, you will create a mess in your Event reporting because you will have as many categories as item values. Therefore, if you assign a predefined value to the category (e.g. clicks), you're left with only 2 event fields (action, label) to store 3 items (location, text, href). That in itself isn't the end of the world, because you can concatenate 2 items into 1 event field, then use the reporting or the API to filter things out. Accordingly, what I plan on doing is this: category: clicks; action: {location_on_page} ¦ {text}; label: {href}, where {__} are variable values related to the clicked links. With this I can easily create some reports directly via the GUI: downloads (include only events where the label ends with .pdf) and click-outs to particular domains (include only events where the label contains the domain). For more complex tasks I need to export the data (or use the API), e.g. the influence of location on clicks: for each location in the design, count the number of events that have that location in the action, then corroborate with pageviews of the corresponding pages. Whilst this looks good, I'm wondering if there is a better approach, hence the following questions: Q1: Can you foresee any particular issues with this particular setup (e.g. things I won't be able to report on)? Q2: Can you think of other data that would be interesting to include in the event?
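
    A minimal sketch of the click handler this scheme implies, using the classic ga.js async syntax; how the nearest parent id is found and the 50-character truncation are assumptions on my part, not part of the question:

        // Attach one listener per link once the DOM is ready (a sketch, not production code).
        var links = document.getElementsByTagName('a');
        for (var i = 0; i < links.length; i++) {
            links[i].onclick = function () {
                // Walk up to the nearest ancestor that has an id: the link's "location".
                var parent = this.parentNode;
                while (parent && parent !== document && !parent.id) {
                    parent = parent.parentNode;
                }
                var loc = (parent && parent.id) ? parent.id : 'unknown';
                var text = (this.textContent || this.innerText || '').replace(/\s+/g, ' ').substring(0, 50);
                // category: clicks | action: "{location_on_page} ¦ {text}" | label: {href}
                _gaq.push(['_trackEvent', 'clicks', loc + ' ¦ ' + text, this.href]);
            };
        }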

    Read the article

  • Troubleshooting Website problems within the local network

    - by HaydnWVN
    Have an external website which opens fine on some PCs, yet shows symptoms of timing out on others (though it never actually times out). It seems to only affect (some of) our newer HP Pro 3305 MT workstations, all of which are running Win7 32-bit SP1 with all updates. Older PCs (Win7 32-bit SP1 and WinXP) are unaffected. Using Google Chrome or Firefox makes no difference, and opening the website in IE9 Compatibility Mode has exactly the same symptoms. All PCs are on the same local network (workgroup), using the same DNS server and gateway (in-house), on the same internet connection and the same subnet. There is no proxy server, no content filtering, no load balancing, etc. The only group policy in effect (locally) is for update scheduling. Local firewalls are all the same (Kaspersky WP4) and our external-facing firewall has no IP-specific settings. I have no control over the external website; traceroute shows the same destination on all PCs. It is a fairly popular website in our industry (horticulture) and I'm not aware of any other people (even other sites within our sister companies) with the same problem.

    Update: Used Fiddler2 to monitor the HTTP request; it seems it's not getting fulfilled for some reason. Request sent:

        GET http://www.rhs.org.uk/ HTTP/1.1
        Host: www.rhs.org.uk
        Connection: keep-alive
        User-Agent: Mozilla/5.0 (Windows NT 6.1) AppleWebKit/536.11 (KHTML, like Gecko) Chrome/20.0.1132.47 Safari/536.11
        Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
        Accept-Encoding: gzip,deflate,sdch
        Accept-Language: en-GB,en-US;q=0.8,en;q=0.6
        Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.3

    Log from Fiddler2 of the request:

        This session is not yet complete. Press F5 to refresh when session is complete for updated statistics.
        Request Count: 1
        Bytes Sent: 567 (headers:567; body:0)
        Bytes Received: 0 (headers:0; body:0)
        ACTUAL PERFORMANCE
        ClientConnected: 17:02:33.720
        ClientBeginRequest: 17:02:39.118
        GotRequestHeaders: 17:02:39.118
        ClientDoneRequest: 17:02:39.118
        Determine Gateway: 0ms
        DNS Lookup: 0ms
        TCP/IP Connect: 46ms
        HTTPS Handshake: 0ms
        ServerConnected: 17:02:39.165
        FiddlerBeginRequest: 17:02:39.165
        ServerGotRequest: 17:02:39.165
        ServerBeginResponse: 00:00:00.000
        GotResponseHeaders: 00:00:00.000
        ServerDoneResponse: 00:00:00.000
        ClientBeginResponse: 00:00:00.000
        ClientDoneResponse: 00:00:00.000
        RESPONSE BYTES (by Content-Type)
        ~headers~: 0

    Log of a successful request from a working PC (done this morning; excuse the timestamps being different from above):

        Request Count: 1
        Bytes Sent: 493 (headers:493; body:0)
        Bytes Received: 20,413 (headers:525; body:19,888)
        ACTUAL PERFORMANCE
        ClientConnected: 08:22:47.766
        ClientBeginRequest: 08:22:47.766
        GotRequestHeaders: 08:22:47.766
        ClientDoneRequest: 08:22:47.766
        Determine Gateway: 0ms
        DNS Lookup: 26ms
        TCP/IP Connect: 30ms
        HTTPS Handshake: 0ms
        ServerConnected: 08:22:47.828
        FiddlerBeginRequest: 08:22:47.828
        ServerGotRequest: 08:22:47.828
        ServerBeginResponse: 08:22:48.905
        GotResponseHeaders: 08:22:48.905
        ServerDoneResponse: 08:22:48.905
        ClientBeginResponse: 08:22:48.905
        ClientDoneResponse: 08:22:48.905
        Overall Elapsed: 00:00:01.1388020
        RESPONSE BYTES (by Content-Type)
        text/html: 19,888
        ~headers~: 525

    So my question has evolved into: what is the difference between the two requests, and how do I determine why one PC is not getting a reply to its GET request?

    Read the article

  • What is the typical example of old school website design?

    - by Pierre 303
    I want to build a website for a retro thing that was popular in the mid 90s (the beginning of the commercial internet), so I want to use old designs that were very popular at that time. The first thing that comes to my mind is those "under construction" animated GIFs. People often put animated GIFs everywhere. But also those awful repeating backgrounds. So yes, I want my website to look exactly like it's from the mid nineties ;) (please suggest practical and usable features; I guess a Java applet menu would not work today, nor would saying at the bottom that this website is optimized for Netscape 3). EDIT: for those who want to see the result: Retrology

    Read the article

  • Does Google Analytics exclude Campaign traffic from Facebook in the Social reports?

    - by user1612223
    For a while we have used campaign tags when putting posts on Facebook so that we can run campaign reports in Google Analytics on those links. However, it appears that traffic from those links is being excluded from Google's Social reports. For example, between 7/20 and 8/19 I'm seeing 123 visits where Facebook is the source in my Campaigns report, but only 29 visits where Facebook is the source in my Social Sources report. Main questions: Does Google exclude campaign traffic from its Social reports? If it does, is there any way to reconcile that so that the traffic shows up in both reports? If it doesn't, what could be causing the vast discrepancy? One observer noted that we are setting the medium to "Post" when passing the campaign parameters, and that Google may only allow "Referral" traffic in its Social reports (just speculation). In that case we could potentially change the medium to "Referral", but that would undermine some of our strategy of being able to set different mediums. I have also considered that maybe the campaign traffic came to the site several times and the Social report may count the same user as fewer visits; however, over 70% of the Facebook campaign traffic is new traffic, so at a minimum there would need to be over 85 visits on the Social side for that argument to be valid. I've done several searches for any information on this topic and haven't run across much of anything. I did post the same question on Google's Product Forum and have not gotten a response; the title of that question was 'Facebook Campaign Traffic Not Showing in Social Reports'. The inability to pass campaign data on Facebook posts would make evaluating the performance of those specific posts very difficult, so I'm hoping there is a solution to this.
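
    For context, a hedged example of the kind of tagged link being discussed (the domain and parameter values are made up); the only difference between the two variants is the utm_medium value:

        http://www.example.com/landing?utm_source=facebook&utm_medium=post&utm_campaign=august-promo
        http://www.example.com/landing?utm_source=facebook&utm_medium=referral&utm_campaign=august-promo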

    Read the article

  • With Google Analytics, is it possible to check a specific page in Multi-Channel conversion attribution?

    - by Emmett R.
    I'm somewhat new to Google Analytics, and I'm trying to track all conversions that are assisted by a particular landing page, because I don't expect an instant purchase. I have e-commerce tracking set up. Due to the constraints of the associated ad campaign, I can't include the source/medium code in the URL when people go to the landing page, and all of my traffic to the landing page is likely to be direct, so I'm not sure how to tell Multi-Channel reporting that it's a significant page. I know how to add events to a page, but I'm still figuring out what they can and cannot do. Would creating a redirect from the landing URL to an identical URL with the source/medium code appended work? Any advice on how to accomplish this would be greatly appreciated. Tracking the final sale conversion is not the issue; e-commerce reporting is functioning just fine on the site. I just want to report the landing page as an assist whenever it shows up in the funnel, and I need to be able to do that across multiple visits.

    Read the article

  • Google Analytics: Do unique events report as unique visits when triggered on pages other than your own domain?

    - by Jesse Gardner
    We just recently attached a SWF to our Brightcove video player to report various events back to Google Analytics. We're also tracking page views with a standard GA snippet on the page where the player is embedded. As I understand it, because a unique has already been recorded for the page, any event triggered by the player gets associated with that unique. However, we allow people to embed the video player on other websites. All of the event data started pouring into the Events section as expected, but we noticed a dramatic uptick in unique visitors on the site (nearly double) while the pageview count stayed relatively unchanged. Disabling event tracking brought the traffic back down to average levels. I should also add that in the Pages section of Event tracking we're seeing URLs for other sites where the player has been embedded, but this data isn't showing up in the Content section. It seems counterintuitive, but does GA count a fired event as a unique visit even if it's triggered from somewhere other than your website? If so, is there any way to trigger an event in the Events section without it adding to the unique visitor count?
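
    One approach sometimes used (not a confirmed fix for the visitor counting, just a way to keep embed traffic separable) is to have the player report to a named secondary tracker on its own property, so its events don't mix with the main site's visitor counts. A sketch with hypothetical property ids:

        var _gaq = _gaq || [];

        // Main site tracker (default), used for ordinary pageviews.
        _gaq.push(['_setAccount', 'UA-XXXXXXX-1']);
        _gaq.push(['_trackPageview']);

        // Named tracker for the embedded player, reporting to a separate property.
        _gaq.push(['player._setAccount', 'UA-XXXXXXX-2']);  // hypothetical id
        _gaq.push(['player._trackEvent', 'Video', 'play', 'brightcove-player']);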

    Read the article

  • Google Analytics not tracking data correctly: IP address issue?

    - by PaperThick
    I have developed a small site for a client and the site has been placed inside an <iframe> on the client's site. The GA script I'm using looks like this:

        <script type="text/javascript">
        var _gaq = _gaq || [];
        _gaq.push(
            ['_setAccount', 'UA-XXXXXXXX-2'], // My company's GA account
            ['_trackPageview'],
            ['b._setAccount', 'UA-XXXXXXXX-1'], // Test GA account
            ['b._trackPageview'],
            ['th._setAccount', 'UA-XXXXXXX-3'],
            ['th._setDomainName', '.clientdomain.se'], // Client GA account
            ['th._trackPageview']
        );
        (function () {
            var ga = document.createElement('script');
            ga.type = 'text/javascript';
            ga.async = true;
            ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
            var s = document.getElementsByTagName('script')[0];
            s.parentNode.insertBefore(ga, s);
        })();
        </script>
        </head>

    As you can see, I report the GA pageviews to the client as well. The GA script is tracking visitors and pageviews at both ends. But the problem is that on my client's side the visitor count is more than double what it is on my end (20 000 vs 5 000). At first I thought it was being duplicated at some point, but when I checked my Crazy Egg account I saw that it had tracked over 10 000 visits and then stopped tracking because that was the limit on my account. The page my site is on sits at an IP address (http://XXX.XXX.XX.X/campaign/) and not at a "valid" URL. Could that be a reason why some of the visitors aren't being tracked? Thanks in advance
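
    Since the framed page lives at a bare IP address rather than under .clientdomain.se, the 'th' tracker's cookies cannot be tied to the client's domain, which is one plausible source of the discrepancy. ga.js does provide _setDomainName('none') and _setAllowLinker for third-party/iframe setups; a hedged sketch of how the client-account tracker might be configured (whether it actually explains the numbers would need testing):

        // Inside the framed page (served from the IP address):
        _gaq.push(['th._setAccount', 'UA-XXXXXXX-3']);
        _gaq.push(['th._setDomainName', 'none']);   // don't tie the cookie to .clientdomain.se
        _gaq.push(['th._setAllowLinker', true]);    // accept linker parameters passed from the parent page
        _gaq.push(['th._trackPageview']);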

    Read the article
