Search Results

Search found 2725 results on 109 pages for 'crm analytics'.

Page 18/109 | < Previous Page | 14 15 16 17 18 19 20 21 22 23 24 25  | Next Page >

  • Where does one enter the JavaScript code in CRM Dynamics?

    - by Konrad Viltersten
    I started playing with CRM Dynamics yesterday, so this should be seen as a very basic question. I've been coding for many years, but Dynamics is new to me. Apparently one is supposed to be able to enter JavaScript code to customize the behavior of the application. I understand there's an API for that, and that touching the DOM directly or playing with jQuery is a no-no. Question: where is the JS code supposed to be entered? I've gone through all the menus, but as far as I can see there's no spot where I could plug in my custom code. For example: where do I define a validation for the last name of a contact that is currently being created?

    Read the article
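
    For reference on where such code lives: in Dynamics CRM the script is uploaded as a JScript web resource and then wired to a form or field event (Form Properties > Form Libraries and Event Handlers in the form editor) rather than edited anywhere in the page itself. Below is a minimal sketch of a last-name validation handler; it assumes the CRM 2011-style form scripting API (Xrm.Page) and the standard "lastname" field on the contact form, and the function name and message are illustrative only.

        declare const Xrm: any;   // provided by the CRM form at runtime

        // Attach to the OnChange event of the Last Name field, or to the form's OnSave event,
        // via Form Properties > Event Handlers in the form editor.
        function validateLastName(executionContext?: any): void {
            const attr = Xrm.Page.getAttribute("lastname");
            const value: string | null = attr ? attr.getValue() : null;

            if (!value || value.trim().length < 2) {
                // Xrm.Utility.alertDialog is available in later CRM 2011 rollups; plain alert() also works.
                Xrm.Utility.alertDialog("Please enter a valid last name.", function () { });
                // When wired to OnSave, the save can also be cancelled:
                // executionContext.getEventArgs().preventDefault();
            }
        }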

  • Does placing Google Analytics code in an external file affect statistics?

    - by Jacob Hume
    I'm working with an outside software vendor to add Google Analytics code to their web app, so that we can track its usage. Their developer suggested that we place the code in an external ".js" file, which he could include in the layout of his application. The Stack Overflow question "Google Analytics: External .js file" covers the technical aspect, so tracking is apparently possible via an external file. However, I'm not quite satisfied that this won't have negative implications. Does including the tracking code as an external file affect the statistics collected by Google?

    Read the article
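
    For context, the external file in question is usually just the stock asynchronous ga.js snippet moved verbatim out of the page, along the lines of the sketch below ('UA-XXXXXXX-1' is a placeholder property ID and the file name is arbitrary). The main practical caveats are that the extra request for the file can itself be cached or blocked, and that it should still be referenced early in the page so short visits are not lost.

        // /js/analytics.js, included from the layout with an ordinary script tag
        var _gaq: any[] = (window as any)._gaq || [];
        (window as any)._gaq = _gaq;

        _gaq.push(['_setAccount', 'UA-XXXXXXX-1']);
        _gaq.push(['_trackPageview']);

        (function () {
            var ga = document.createElement('script');
            ga.async = true;
            ga.src = ('https:' === document.location.protocol ? 'https://ssl' : 'http://www')
                + '.google-analytics.com/ga.js';
            var s = document.getElementsByTagName('script')[0];
            s.parentNode!.insertBefore(ga, s);
        })();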

  • Getting live traffic/visitor analytics when using a reverse proxy

    - by jotto
    I'm in the process of implementing Varnish as a reverse proxy for a Ruby on Rails app, and I'm using Google Analytics (a JS/client-side script that records visitor data), but its data is several hours delayed, so it's useless for knowing what's going on right now. I need at-a-glance live data that includes referring traffic and the current requests per second. Right now I am using a simple Rack middleware application to do the live stats (gist.github.com/235745), but if the majority of traffic hits Varnish, Rack will never be hit, so this won't work. The closest solution I've found so far is http://www.reinvigorate.net/ but it's in beta (there are also no implementation details on their front page). Does Varnish have traffic logs that I can custom-format to match my Apache logs so I can combine them, or will I have to roll my own JS implementation, like GA, that shows the data in real time?

    Read the article

  • SharePoint Web Analytics not tracking usage for main application

    - by Chris W
    My SP 2010 setup is two separate applications: one for the main portal and one for MySite. Whilst Web Analytics is tracking usage of MySite, it's not showing any stats for the main portal. The only thing it lists is the number of site collections, with no page views etc. The WA service is clearly running, since it picks up data for MySite. In "Configure web analytics and health data collection" everything is ticked. I can't find any obvious settings that differ between the two applications. Where should I look to get usage tracking working correctly?

    Read the article

  • Keep Google Analytics in a backup site or not?

    - by Yannis Dran
    I backed up my website and uploaded it to another server for testing and backup purposes. Should I remove the Google Analytics snippet from the index.php (which is for the real site), or does it not matter, given that the backup is not on the same server and URL as the one declared in the Google Analytics account? The reason I don't want to remove it is that if someone uploads the backup to the real site (in case the real one breaks), they might forget to add the snippet back. I also know that if I take the backup site down there is no GA snippet to worry about, but I need it up so I can access and test it easily without having to type a password all the time.

    Read the article
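
    One low-effort middle ground, if the snippet is going to stay in index.php either way, is to guard it by hostname so the copy on the test server never reports anything. A rough sketch, assuming the live site is served from example.com (substitute the real domains):

        var host = window.location.hostname;
        var isLiveSite = /(^|\.)example\.com$/.test(host);   // placeholder domain check

        if (isLiveSite) {
            (window as any)._gaq = (window as any)._gaq || [];
            (window as any)._gaq.push(['_setAccount', 'UA-XXXXXXX-1']);
            (window as any)._gaq.push(['_trackPageview']);
            // ...load ga.js asynchronously as usual...
        }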

  • Is there a modern (e.g. NoSQL) web analytics solution based on log files?

    - by Martin
    I have been using AWStats for many years to process my log files, but I am missing many possibilities (like cross-domain reports) and I hate being stuck with the extra fields I created years ago. Either way, I am not going to keep using this script. Is there a modern Apache log analytics solution based on modern storage technologies like NoSQL, or at least one that is somehow ready to cope with large datasets efficiently? I am primarily looking for something that generates nice sortable and searchable output with a focus on web analytics, before I have to write my own frontend (so Graylog2 is not an option). This question is purely about log-file-based solutions.

    Read the article
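
    Whatever storage engine ends up behind it, solutions of this kind generally follow the same shape: parse each combined-log line into a structured document, bulk-load the documents into something queryable (a document store, a column store, or even a plain indexed table), and build the sortable/searchable reports on top of that rather than pre-computing fixed AWStats-style pages. A rough sketch of the parsing step; the regex and field names are illustrative, not a complete or hardened parser.

        // Matches the Apache/Nginx "combined" log format.
        const COMBINED = /^(\S+) \S+ \S+ \[([^\]]+)\] "(\S+) (\S+) [^"]*" (\d{3}) (\S+) "([^"]*)" "([^"]*)"$/;

        function parseLine(line: string) {
            const m = COMBINED.exec(line);
            if (!m) return null;
            return {
                ip: m[1],
                time: m[2],                              // e.g. 10/Oct/2012:13:55:36 -0700
                method: m[3],
                path: m[4],
                status: Number(m[5]),
                bytes: m[6] === '-' ? 0 : Number(m[6]),
                referer: m[7],
                userAgent: m[8],
            };
        }

        // parseLine('127.0.0.1 - - [10/Oct/2012:13:55:36 -0700] "GET / HTTP/1.1" 200 2326 "-" "Mozilla/5.0"')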

  • Is it possible to filter analytics to particular visits like you can filter to particular dates?

    - by andy
    Is it possible to find out more information about particular visits in Analytics? For example, say I'm looking at new versus returning users. I then add a secondary column of "City". OK, now I know that all new users from yesterday came from New York, for example. But what if I want to find out more about those particular new visits from New York, such as behavior, technology, and content? Is it possible to filter Analytics to particular visits the way you can filter to particular dates?

    Read the article

  • Google Analytics unexplained spike

    - by Dianne
    My client's Google Analytics has shown a spike every day since May 6th (from 0 to 100), in a city that he is not optimized for and does very little business in. The hits are coming in direct to the website. My client is concerned that it has something to do with competitors using his site as a price-shopping device. I can't view the IP to see where the visits are coming from, and his site is not built in PHP, so the usual workaround doesn't apply here. Any thoughts? Could it be a "referring site" situation, and if so, is there a way for me to find out what the referring site is?

    Read the article

  • Dashboard to aggregate Google Analytics, Facebook, YouTube etc tracking data?

    - by Richard
    I'd like to see as much tracking data as possible about my online presence in one single dashboard: views/conversions from Google Analytics data, the performance of my Facebook campaigns via the Insights API, views/clicks from my YouTube campaigns, etc. This could be as simple as a graph with time on the x-axis and key indicators from each source on the y-axis (conversions from Analytics, likes on Facebook, views on YouTube, etc.). The idea is that I can see customer engagement with each source over time. I can write such a dashboard myself easily enough, but I wondered if there was something off-the-shelf that already did this. Apologies if this isn't the right forum for such a question; I'd appreciate tips for the best place to ask.

    Read the article
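
    Off-the-shelf options aside, the home-grown version mostly comes down to normalising each source into the same daily-series shape before charting; the individual fetchers (Google Analytics Core Reporting API, Facebook Insights, YouTube Analytics) just have to fill that shape in. A sketch of the data-shaping half, with made-up source and metric names:

        interface DailyPoint { date: string; value: number; }      // e.g. { date: '2012-07-01', value: 42 }
        interface SourceSeries { source: string; metric: string; points: DailyPoint[]; }

        // Merge per-source series into one row per date, ready for a multi-line time chart.
        function mergeSeries(series: SourceSeries[]): Array<Record<string, string | number>> {
            const byDate = new Map<string, Record<string, string | number>>();
            for (const s of series) {
                for (const p of s.points) {
                    const row = byDate.get(p.date) || { date: p.date };
                    row[s.source + '.' + s.metric] = p.value;
                    byDate.set(p.date, row);
                }
            }
            return Array.from(byDate.values())
                .sort((a, b) => String(a.date).localeCompare(String(b.date)));
        }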

  • In Google Analytics, how can I determine the value of a page if no goals or revenue have been determined?

    - by Brandon Durham
    I have 4 years of data in Analytics, with over 20 million pageviews for the entire site. No goals have ever been set up, and while the site is an e-commerce site, none of the e-commerce features in Google Analytics have ever been used. So I have no way to determine what the actual value of a page is. I've been tasked with determining whether a particular page on the site is worth keeping around. How might I use the standard data (pageviews, bounce rate, time on page, time on site, etc.) to help determine the value of this page? I really appreciate any help I can get!

    Read the article
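
    Without goals or e-commerce tracking there is no real "Page Value" figure to recover, so any answer ends up being a proxy score built from the engagement data that does exist. One possible heuristic, purely as an illustration; the weights and the cap are arbitrary assumptions, not a Google Analytics formula.

        interface PageStats {
            pageviews: number;
            avgTimeOnPage: number;   // seconds
            bounceRate: number;      // 0..1
            exitRate: number;        // 0..1
        }

        // Higher score = carries more traffic, keeps visitors engaged, and sends them onward.
        function pageScore(p: PageStats, sitePageviews: number): number {
            const trafficShare = p.pageviews / sitePageviews;
            const engagement = (1 - p.bounceRate) * Math.min(p.avgTimeOnPage / 60, 5);   // capped "engaged minutes"
            const retention = 1 - p.exitRate;
            return 100 * trafficShare * (0.5 * engagement + 0.5 * retention);
        }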

  • Tracking logged in vs. non-logged in users in Google Analytics

    - by Justin
    I am building a social media site that is similar in structure to Twitter and Facebook, where unauthenticated users who go to https://mysite.com will see a login + sign-up page, and authenticated users who go to https://mysite.com will see their timeline. My question is: what is the best practice (using Google Analytics) for tracking these two different types of users, who are viewing completely different content but visiting the same URL? I tried searching the Google Analytics docs but couldn't find what they suggest for this scenario. Perhaps I just don't know what keywords to search for. Thanks in advance for any help.

    Read the article
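
    With classic ga.js tracking the usual pattern is to set a visitor- or session-scoped custom variable before _trackPageview and segment the reports on it; a virtual pageview can additionally disambiguate the identical URL. A minimal sketch (slot 1, the 'UserType' name, and the virtual paths are arbitrary choices):

        declare const _gaq: any[];   // the standard async queue created by the GA snippet

        function trackHomepage(isLoggedIn: boolean): void {
            // _setCustomVar(index, name, value, scope); scope 1 = visitor-level, 2 = session-level.
            _gaq.push(['_setCustomVar', 1, 'UserType', isLoggedIn ? 'member' : 'visitor', 2]);
            _gaq.push(['_trackPageview', isLoggedIn ? '/home/timeline' : '/home/signed-out']);
        }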

  • Is Google Analytics safe for websites that deal with sensitive information?

    - by guanome
    I work for a company that writes several web apps that deal with a lot of sensitive information, such as full names, dates of birth, addresses, and SSNs. Currently we don't have anything to measure site usage, but I would like to use Google Analytics to track usage and statistics about our users. What data is sent to Google when you use Analytics? If I put the tracking code on a page that contains any of the above information, will that data be sent to Google? Or do they just get the usual information, like user agent and IP address?

    Read the article
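
    Roughly speaking, ga.js reports the page URL, page title, referrer, browser and screen details, and its own visitor cookies via the __utm.gif request; it does not read form field contents. The practical risk is sensitive values leaking into URLs or page titles, since those are sent verbatim. A cautious setup, as a sketch, anonymises the IP and reports only the path so query-string parameters never reach Google:

        declare const _gaq: any[];

        _gaq.push(['_setAccount', 'UA-XXXXXXX-1']);                 // placeholder property ID
        _gaq.push(['_gat._anonymizeIp']);                           // truncate the visitor IP before it is stored
        _gaq.push(['_trackPageview', window.location.pathname]);    // drop location.search (ids, tokens, etc.)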

  • Are my Google Analytics (2 domains, 1 site) duplicated or unique?

    - by MarcDJay
    We recently built a new website with a new domain to replace an old website and, on the advice of our IT guys and web dev team, pointed both oldaddress.com's and newaddress.com's A records at the new website. They both share the same Google Analytics code (UA-12345-1), and as such we have two entries in the Google Analytics dashboard. The problem is that I'm still fairly new to GA, and as the reports seem VERY similar (~25k pageviews for each domain), I'd like to know whether these figures are exclusive to each domain. For example: oldaddress.com 25,400 pageviews, newaddress.com 25,600 pageviews. Does this mean that in total the website had 51,000 pageviews? Hope this is clear enough, but let me know if anything needs clarifying. Thanks.

    Read the article

  • CRM 2011 XAML Workflows in VS 2012

    - by AlexR
    I'm writing this knowing there's another topic like this to be found here; the answer provided, however, does not make much sense to me, and whatever I tried doesn't seem to work, so I'm seeking clarification. The situation is as follows:

        * New CRM 2011 toolkit solution
        * New project added for a XAML workflow
        * Adding the CRM "Workflow" activity causes an error

    The error: a red box is shown saying "Could not generate view for Workflow". The exception: System.Reflection.TargetInvocationException: Exception has been thrown by the target of an invocation. --- System.IO.IOException: Cannot locate resource 'workflowdesigner.xaml'. I have VS 2012 SP2 with the latest (June) CRM SDK toolkit installed. I did a clean install of the toolkit, so it's not "upgraded", to prevent any old assemblies being referenced. I tried the things from the referenced question, moved the assemblies over, and also re-referenced the latest assemblies to make sure everything works fine. I also tried downloading a workflow XAML created in CRM and opening it, but it gives the same error. I also opened the "example" workflow XAML solution in the SDK; again, the same result.

    Read the article

  • Implementing a two-way communication between Microsoft Dynamics CRM and 3rd party app

    - by CxDoo
    I need to implement bi-directional communication between Microsoft Dynamics CRM and a 3rd-party server. The ideal scenario is as follows: the user tries to create an entity in CRM; in a pre-create hook a 3rd-party library function (or web service, or whatever) is called, filled with the relevant info, and tries to create the respective entity on the server; if the call fails, creation fails in CRM; if the call succeeds, the entity is created in CRM AND additional fields are filled with the return values from the call. More specifically, I want to do something like this when a user tries to create a new entity instance:

        try {
            // TradeInfo info was initialized on the external server
            ExternalWebService.CreateTrade(ref info);
            myCRM_Trade_Entity.SerialNo = info.SerialNo;
            CreateNew(myCRM_Trade_Entity);
        } catch (Exception) {
            throw;   // creation should fail in CRM as well
        }

    What would be the suggested way to do this? I am new to Dynamics and have read about workflows and plugins, but I am not sure how I should do this properly.

    Read the article

  • Banco Espírito Santo Increases Sales Campaign Success Rate with Siebel CRM

    - by Tony Berk
    Banco Espírito Santo (BES), founded in 1869, is the second-largest private financial institution in Portugal, with a 20.3% domestic market share, 2.1 million customers, and more than 700 in-country branches. It also has a strong international presence, with operations in 23 countries and four continents. With strong growth in its major markets, BES needed a modern, cost-effective, scalable, and reliable customer relationship management (CRM) solution for its retail operations. The bank wanted to optimize client relationship management and integrate all customer touch points and service channels to improve the success of its sales and marketing initiatives. BES implemented the same CRM solution as many other leading banks: Oracle's Siebel CRM. With Siebel CRM 8.1 and other Oracle solutions, BES significantly increased sales of its new financial products across all channels, by up to 25%, and it expects to increase revenue by up to US$4 million annually. It also improved the success rate of bank branch sales, marketing, and lead generation campaigns by nearly 10%. “We are very happy with Oracle’s Siebel CRM applications. We already knew that this was the best solution available, but it has surpassed our best expectations,” said João Manaças, Customer Relationship Management Manager, Personal Marketing Department, Banco Espírito Santo. Click here to learn more about BES's use of Siebel CRM.

    Read the article

  • Site Web Analytics not updating Sharepoint 2010

    - by Rohit Gupta
    If you are facing the issue that the Web Analytics reports in SharePoint 2010 Central Administration are not updating their data, i.e. when you go to your site > Site Settings > Site Web Analytics reports or Site Collection Analytics reports you get old data, with the ribbon displaying "Data Last Updated: 12/13/2010 2:00:20 AM", please ensure that the following things are covered:

        * The Usage and Health Data Collection service is configured correctly.
        * The log collection schedule is configured correctly.
        * The Microsoft SharePoint Foundation Usage Data Import and Microsoft SharePoint Foundation Usage Data Processing timer jobs are configured to run at regular intervals.
        * One last important timer job is the Web Analytics Trigger Workflows timer job; ensure that it is enabled and scheduled to run at regular intervals (for each site that you need analytics for).

    After you have ensured that the Web Analytics service configuration is working fine, that the Usage Data Import job is importing the *.usage files from the ULS logs folder into the WSS_Logging database, and that all the required timer jobs are running as expected, wait for a day for the report to get updated. The report is updated automatically at 2:00 AM each morning, and I could not find a way to control the schedule for this report update job. So be sure to wait for a day before giving up :)

    Read the article

  • Why The Athene Group Chose Fusion CRM

    - by Tony Berk
    A guest post by Vikas Bhambri, Managing Partner, The Athene Group. This year, The Athene Group (www.theathenegroup.com) celebrated our tenth anniversary. The company has accomplished a lot in ten years, overcoming a number of hurdles and challenges to grow organically into a 150+ person global company with offices in the US, UK, and India and customers in the US, Canada, and Europe. Now more than ever, given the current global landscape from an economic and competitive standpoint, it was vital that we make some changes to remain successful for the next ten years. There were two key initiatives that we discussed internally that would enable us to accomplish this: collaboration and the concept of "insight to action". With our existing Oracle CRM On Demand platform we had components of this, but not the full depth and breadth that we were looking for. When we started to discuss Fusion CRM we immediately saw several next-generation tools that would embrace these two objectives. For a consulting and development organization, the collaboration required between business development and consulting delivery is as important as the collaboration required during projects between the project delivery and account management teams. The Activity Streams functionality in Fusion CRM immediately addressed the communication of key discussion topics and exchanges around our clients. Of course, when we saw the Oracle Social Network (which is part of our Fusion CRM roadmap) we were blown away. The combination of OSN and our CRM is going to make us more effective as we discuss and work cohesively on client engagements, ensuring mutual success for both Athene and our clients. When we looked at "insight to action" we saw that we had a great platform when folks were at their desks; unfortunately, a lot of our business development and consulting folks are on the road. Fusion Mobile Sales and Fusion Outlook Desktop provide information to our teams when they are on the go, so that they can provide real-time information and react to real-time information provided by their peers. We are in the early stages of our transformative experience with Fusion CRM, but we believe the platform, along with our people and processes, is going to help us achieve our goals in the future.

    Read the article

  • Successful Fusion CRM Bootcamp in Paris - July 24-26th

    - by Richard Lefebvre
    The first Fusion CRM Bootcamp for EMEA partners successfully took place at the Paris Pullman Bercy hotel on July 24-26th. The agenda covered 14 Fusion CRM topics in depth, including detailed presentations and hands-on exercises, delivered by a team of Fusion CRM experts from Oracle Product Development. 89 participants, representing 55 companies from 14 different countries, attended the event, which was also a great opportunity to network with Oracle Product Development and Alliances & Channels executives during the breaks and at the "Fusion Lounge" session at the end of each training day. As expressed in the event survey, overall satisfaction was impressive, with 85%+ of participants responding that the event "met or exceeded expectations", and with individual comments such as: "On top of the presentation of Fusion CRM as a product, this event allowed us to better understand Oracle's product and rollout strategy"; "The ability to meet the development team was really a bonus"; "Extremely valuable information that enables integrators to get on the road to Fusion CRM"; "Excellent organization, good product information coverage and demonstration". Additional Fusion CRM bootcamps are planned across EMEA in the coming quarters, although they will probably be in a different format, which is still to be defined.

    Read the article

  • Database types for customer analytics

    - by Drewdavid
    I am exploring a paid solution to start providing better embedded, dashboard-style analytics information to our website customers/account holders, but would also like to offer an in-house development option to our team. The more equipped I am with specifics (such as the subject of this question), the better the adoption rate from the team (or so I have found), regardless of the path we choose. Would anyone care to summarize a couple of options for a fast and scalable database type through which we would provide the following:

        • Daily pageviews of a user's account pages (users have between 1 and 1,000 pages)
        • Some calculated/compounded metrics (such as conversion rate, i.e. the ratio of views of a certain page type to contact-form thank-you page views)
        • We have about 1,500 members (and will need room to grow); the number of concurrently logged-in users will, for the question's sake, be 50

    I ask because our developer has balked at providing this level of "over time" granularity (i.e. daily) due to the amount of space it would take up in a MySQL database. To avoid a downvote I have asked specifically for more than one option, realizing that different people will have different solutions. I will amend my question if so guided by answering parties. Thank you for sharing your valued answers :)

    Read the article
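
    As a rough sense of scale, assuming one pre-aggregated row per account, page, and day rather than raw hits: 1,500 members × up to 1,000 pages × 365 days is at most ~550 million small rows a year, and in practice far fewer, since only pages actually viewed that day need a row. A sketch of what each rollup record might hold, independent of whether it lives in MySQL, a document store, or a column store:

        interface DailyPageRollup {
            accountId: number;
            pageId: number;
            day: string;                 // '2012-06-01'
            pageviews: number;
            uniqueVisitors: number;
            contactFormThanks: number;   // so conversion rate = contactFormThanks / pageviews
        }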

  • SharePoint Web Analytics not tracking usage for main application

    - by Chris W
    My SP 2010 setup is two separate applications: one for the main portal and one for MySite. Whilst Web Analytics is tracking usage of MySite, it's not showing any stats for the main portal. The only thing it lists is the number of site collections, with no page views etc. The WA service is clearly running, since it picks up data for MySite. In "Configure web analytics and health data collection" everything is ticked. I can't find any obvious settings that differ between the two applications. Where should I look to get usage tracking working correctly? Edit: Having played with the date ranges, I see that I've actually got no stats in the last 7 days for any site at all, including MySite, which had been working at some point previously. Edit: What does each service (WA Data Processing Service vs. WA Web Service) do, and where should they be active? At present they're both running on an app server but not on the WFEs (although they were running on the WFEs previously). From what I can gather they only need to run on an app server, but I find it strange that the only logged activity I see in the staging database relates to Central Admin URLs on the app server and nothing from the WFEs.

    Read the article

  • Convert Google Analytics cookies to Local/Session Storage

    - by David Murdoch
    Google Analytics sets 4 cookies that will be sent with all requests to that domain (and its subdomains). From what I can tell, no server actually uses them directly; they're only sent with __utm.gif as query params. Now, obviously Google Analytics reads, writes, and acts on their values, and they will need to be available to the GA tracking script. So, what I am wondering is whether it is possible to:

        1. rewrite the __utm* cookies to local storage after ga.js has written them,
        2. delete them after ga.js has run,
        3. rewrite the cookies FROM local storage back to cookie form right before ga.js reads them,
        4. start over.

    Or, monkey-patch ga.js to use local storage before it begins the cookie read/write part. Obviously, if we are going so far out of our way to remove the __utm* cookies, we'll also want to use the async variant of Analytics. I'm guessing the downvote was because I didn't ask a question. DOH! My questions are: Can it be done as described above? If so, why hasn't it been done? I have a default HTML/CSS/JS boilerplate template that passes YSlow, PageSpeed, and Chrome's Audit with near-perfect scores. I'm really looking for a way to squeeze those remaining cookie bytes out of Google Analytics in browsers that support local storage.

    Read the article
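
    A rough sketch of steps 1 and 3 of that cycle, assuming local storage is available and ga.js has already executed on the page. The cookie domain has to match whatever GA set (by default the top-level domain), and the __utm* cookies still exist for the lifetime of each page view, so the bytes are only saved on subsequent requests.

        const UTM_NAMES = ['__utma', '__utmb', '__utmc', '__utmz'];

        // Steps 1 and 2: after ga.js has run, copy the GA cookies to local storage and expire them.
        function stashUtmCookies(cookieDomain: string): void {
            for (const pair of document.cookie.split('; ')) {
                const name = pair.split('=')[0];
                if (UTM_NAMES.indexOf(name) !== -1) {
                    localStorage.setItem(name, pair.slice(name.length + 1));
                    document.cookie = name + '=; path=/; domain=' + cookieDomain +
                        '; expires=Thu, 01 Jan 1970 00:00:00 GMT';
                }
            }
        }

        // Step 3: before ga.js runs again, recreate the cookies from local storage.
        function restoreUtmCookies(cookieDomain: string): void {
            for (const name of UTM_NAMES) {
                const value = localStorage.getItem(name);
                if (value !== null) {
                    document.cookie = name + '=' + value + '; path=/; domain=' + cookieDomain;
                }
            }
        }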

< Previous Page | 14 15 16 17 18 19 20 21 22 23 24 25  | Next Page >