Search Results

Search found 56527 results on 2262 pages for 'jacqueline coolidge(at)oracle com'.

Page 168/2262 | < Previous Page | 164 165 166 167 168 169 170 171 172 173 174 175  | Next Page >

  • Comparing the Performance of Visual Studio's Web Reference to a Custom Class

    As developers, we all make assumptions when programming. Perhaps the biggest assumption we make is that those libraries and tools that ship with the .NET Framework are the best way to accomplish a given task. For example, most developers assume that using ASP.NET's Membership system (http://www.4guysfromrolla.com/articles/120705-1.aspx) is the best way to manage user accounts in a website (rather than rolling your own user account store). Similarly, creating a Web Reference to communicate with a web service (http://www.4guysfromrolla.com/articles/100803-1.aspx) generates markup that auto-creates a proxy class, which handles the low-level details of invoking the web service, serializing parameters, ...

    Read the article

  • Sony to stop selling 3.5-inch floppy disks in Japan, but will continue selling them in India

    Sony will stop selling 3.5-inch floppy disks in Japan, but will continue selling them in India. Sony has announced that it will end sales of the famous 3.5-inch floppy disks in Japan by March 2011. [IMG]http://djug.developpez.com/rsc/floppy.jpg[/IMG] Sony, which first brought this storage medium to market in 1983, still sold more than 47 million units in 2000. Don't be surprised: the format is still in use; in 2009 Sony sold more than 8.5 million units in Japan alone, a remarkable figure given that these diskettes hold no more than 1.44 MB. This end of sal...

    Read the article

  • Why is Google PageRank not showing after redirecting www to non www?

    - by muhammad usman
    I have a fashion website. I had redirected my domain http:// (non-www) to the http://www domain, and my preferred domain in Google Webmaster Tools was http://www. Now I have redirected http://www to the http:// domain and have changed my preferred domain as well. Now Google PageRank is not showing for even a single page. Would anybody please help me and let me know if I have done something wrong? Below is my .htaccess redirect code:
    RewriteBase /
    RewriteRule ^index\.php$ - [L]
    RewriteCond %{REQUEST_FILENAME} !-f
    RewriteCond %{REQUEST_FILENAME} !-d
    RewriteRule . /index.php [L]
    RewriteCond %{HTTP_HOST} ^www\.deemasfashion\.com$
    RewriteRule ^deemasfashion\.com/?(.*)$ http://deemasfashion.com/$1 [R=301,L]
    RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /index\.html\ HTTP/
    RewriteRule ^index\.html$ http://deemasfashion.com/ [R=301,L]
    RewriteRule ^index\.htm$ http://deemasfashion.com/ [R=301,L]
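    For comparison, a minimal .htaccess sketch of a host-level www to non-www 301 redirect (the domain is the one from the question; this is only an illustrative sketch, not a statement of what the live site uses):

      # Hypothetical sketch: send any www.deemasfashion.com request to deemasfashion.com
      RewriteEngine On
      RewriteCond %{HTTP_HOST} ^www\.deemasfashion\.com$ [NC]
      RewriteRule ^(.*)$ http://deemasfashion.com/$1 [R=301,L]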

    Read the article

  • Azure Futures - Distributed Computing and Number Crunching

    - by JoshReuben
    "the biggest Azure customers today are the ones using HPC on-premises at the current time" - http://www.zdnet.com/blog/microsoft/windows-azure-futures-turning-the-cloud-into-a-supercomputer/8592?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+zdnet%2Fmicrosoft+%28ZDNet+All+About+Microsoft%29&utm_content=Google+Reader   Orleans Framework for cloud computing - http://research.microsoft.com/en-us/projects/orleans     HPC on Azure - http://www.zdnet.com/blog/microsoft/microsoft-finalizes-its-latest-supercomputing-operating-system-release/7414   Dryad is Microsoft’s competitor to Google MapReduce and Apache Hadoop  - http://www.zdnet.com/blog/microsoft/microsoft-takes-a-step-toward-commercializing-its-dryad-distributed-computing-technologies/8255?tag=mantle_skin;content   SQL Server Analysis Services DataMining in the cloud - http://www.sqlmag.com/article/reporting2/azure-data-mining-in-the-cloud.aspx

    Read the article

  • Satellite website or redirect

    - by Ben
    We're running a campaign for specific industries within our target market. Our main web site has a page for each industry. We also own domains for each industry, i.e. FoodWidgets.com, ElectricalWidgets.com, ChemicalWidgets.com. Of the following methods, which is likely to make the best SEO improvement?
    1. Just link each domain to the main web site.
    2. Forward each domain to the relevant page on the main site, e.g. FoodWidgets.com (302) redirects to http://www.MainSite.com/industries/food.
    3. Create a single-page "satellite" web site for each domain with the same content as the industry page on the main site.
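    A hedged sketch of option 2, assuming each industry domain is parked on the same server as the main site and handled in its .htaccess (domain and path are the examples from the question):

      # Hypothetical sketch: 302-forward FoodWidgets.com to the food industry page
      RewriteEngine On
      RewriteCond %{HTTP_HOST} ^(www\.)?foodwidgets\.com$ [NC]
      RewriteRule ^(.*)$ http://www.MainSite.com/industries/food [R=302,L]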

    Read the article

  • Hide folder names or such?

    - by Miller
    Okay, I have a cPanel account with unmetered everything (I pay a bit per month), so I want to host my forum on it, etc. Let's say I have the domains money.com, wordpress.com and forum.com. I'll have to put everything in different folders, for instance money in /m/, wordpress in /w/ and the forum in /forum/ or something. What I'm asking is: how do I hide the folder so that money.com/m/ looks like just money.com? I need to hide the folder name the contents are in so I can host multiple sites, and each site should look like it's the only site on the host, without me having to add a redirect to send it to its folder. Thanks guys, been trying for a while!
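    A minimal .htaccess sketch of the usual approach, assuming the domains already point at the cPanel account (the domain and folder names are the question's own examples):

      # Hypothetical sketch: serve money.com from the /m/ subfolder without showing /m/ in the URL
      RewriteEngine On
      RewriteCond %{HTTP_HOST} ^(www\.)?money\.com$ [NC]
      RewriteCond %{REQUEST_URI} !^/m/
      RewriteRule ^(.*)$ /m/$1 [L]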

    Read the article

  • How to allow Google Images search to bypass hotlink protection?

    - by Marco Demaio
    I saw that Google Images seems to index my images only if hotlink protection is off. I use hotlink protection anyway because I don't like the idea of people sucking my bandwidth; I simply use this code to protect my sites from being hotlinked:
    RewriteEngine on
    RewriteCond %{HTTP_REFERER} !^$
    RewriteCond %{HTTP_REFERER} !^http(s)?://(www\.)?mydomain\.com/.*$ [NC]
    RewriteCond %{HTTP_REFERER} !^http(s)?://(www\.)?mydomain\.com$ [NC]
    RewriteRule .*\.(jpg|jpeg|png|gif)$ - [F,NC,L]
    But in order to allow Google Images search to bypass my hotlink protection (I want Google Images search to show my images), would it suffice to add lines like these:
    RewriteCond %{HTTP_REFERER} !^http(s)?://(www\.)?google\.com/.*$ [NC]
    RewriteCond %{HTTP_REFERER} !^http(s)?://(www\.)?google\.com$ [NC]
    Because I'm wondering: is the crawler crawling just from google.com? And what about google.it, google.co.uk, etc.? FYI: in Google's official guidelines I did not find info about this. I suppose hotlink protection prevents Google Images from showing images in its results, because I did some tests and it seems hotlink protection does prevent my images from being shown in Google Images search.
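    A sketch of how those extra conditions might sit in the ruleset, with a referer pattern that is not tied to a single Google country domain (the pattern is an assumption made for illustration, not an official list of Google hosts):

      RewriteEngine on
      RewriteCond %{HTTP_REFERER} !^$
      # allow the site's own pages
      RewriteCond %{HTTP_REFERER} !^https?://(www\.)?mydomain\.com(/.*)?$ [NC]
      # assumption: allow referers from any google.<tld>, e.g. google.com, google.it, google.co.uk
      RewriteCond %{HTTP_REFERER} !^https?://([^/.]+\.)?google\.[a-z.]+(/.*)?$ [NC]
      RewriteRule .*\.(jpe?g|png|gif)$ - [F,NC,L]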

    Read the article

  • Cannot send mail to Postfix with iptables Linux proxy

    - by Juzzam
    I have two separate servers, both running Ubuntu 8.04. Server 1 has the real domain name of our site; let's refer to it as example.com. Server 2 is a mail server I have set up with Postfix/Courier. The hostname for this server is mail.example.com. I've set up iptables on Server 1 to forward all traffic on port 25 to Server 2. I used this script (except I changed the target IP address and the port from 80 to 25). When I send an email to user@mail.example.com it works. However, when I try to send an email to user@example.com from Gmail, I get this error:
    550 550 #5.1.0 Address rejected user@example.com (state 14)
    /var/log/mail.log shows no new lines when this happens. What is strange is that it works with telnet from my local machine. For example:
    $ telnet example.com 25
    220 VO13421.localdomain SMTP Postfix
    EHLO example.com
    250-VO13421.localdomain
    250-PIPELINING
    250-SIZE 10240000
    250-ETRN
    250-STARTTLS
    250-ENHANCEDSTATUSCODES
    250-8BITMIME
    250 DSN
    MAIL FROM: me@gmail.com
    250 2.1.0 Ok
    RCPT TO: user@example.com
    250 2.1.5 Ok
    data
    354 Please start mail input.
    hello user... how have you been?
    .
    250 Mail queued for delivery.
    quit
    221 Closing connection. Good bye.
    /var/log/mail.log shows success (and the email goes to the maildir):
    Feb 24 09:47:36 VO13421 postfix/smtpd[2212]: connect from 81.208.68.208.static.dnsptr.net[208.68.xxx.xxx]
    Feb 24 09:48:01 VO13421 postfix/smtpd[2212]: warning: restriction `smtpd_data_restrictions' after `permit' is ignored
    Feb 24 09:48:01 VO13421 postfix/smtpd[2212]: 65C68120321: client=81.208.68.208.static.dnsptr.net[208.68.xxx.xxx]
    Feb 24 09:48:29 VO13421 postfix/smtpd[2212]: warning: restriction `smtpd_data_restrictions' after `permit' is ignored
    Feb 24 09:48:29 VO13421 postfix/smtpd[2212]: 6BDFA120321: client=81.208.68.208.static.dnsptr.net[208.68.xxx.xxx]
    Feb 24 09:48:29 VO13421 postfix/cleanup[2216]: 6BDFA120321: message-id=
    Feb 24 09:48:29 VO13421 postfix/qmgr[2042]: 6BDFA120321: from=, size=395, nrcpt=1 (queue active)
    Feb 24 09:48:29 VO13421 postfix/virtual[2217]: 6BDFA120321: to=, relay=virtual, delay=0.28, delays=0.25/0.02/0/0.01, dsn=2.0.0, status=sent (delivered to maildir)
    Feb 24 09:48:29 VO13421 postfix/qmgr[2042]: 6BDFA120321: removed
    Feb 24 09:48:30 VO13421 postfix/smtpd[2212]: disconnect from 81.208.68.208.static.dnsptr.net[208.68.xxx.xxx]
    iptables -L -n -v --line on example.com yields the following. Anyone know an iptables command to see the port forwarding? Also, it seems to accept all traffic; that's probably bad, right?
    Chain INPUT (policy ACCEPT ... packets, ... bytes)
    num pkts  bytes target prot opt in out source    destination
    1   14041 1023K ACCEPT all  --  *  *  0.0.0.0/0 0.0.0.0/0
    Chain FORWARD (policy ACCEPT 0 packets, 0 bytes)
    num pkts  bytes target prot opt in out source    destination
    1   338   20722 ACCEPT all  --  *  *  0.0.0.0/0 0.0.0.0/0
    Chain OUTPUT (policy ACCEPT 419K packets, 425M bytes)
    num pkts  bytes target prot opt in out source    destination
    1   13711 2824K ACCEPT all  --  *  *  0.0.0.0/0 0.0.0.0/0
    postconf -n results in:
    alias_database = hash:/etc/postfix/aliases
    alias_maps = hash:/etc/postfix/aliases
    append_dot_mydomain = no
    biff = no
    config_directory = /etc/postfix
    delay_warning_time = 4h
    disable_vrfy_command = yes
    inet_interfaces = all
    local_recipient_maps =
    mailbox_size_limit = 0
    masquerade_domains = mail.example.com mail1.example.com
    masquerade_exceptions = root
    maximal_backoff_time = 8000s
    maximal_queue_lifetime = 7d
    minimal_backoff_time = 1000s
    mydestination =
    mynetworks = 127.0.0.0/8 [::ffff:127.0.0.0]/104 [::1]/128
    mynetworks_style = host
    myorigin = example.com
    readme_directory = no
    recipient_delimiter = +
    relayhost =
    smtp_helo_timeout = 60s
    smtp_tls_session_cache_database = btree:${data_directory}/smtp_scache
    smtpd_banner = $myhostname SMTP $mail_name
    smtpd_client_restrictions = reject_rbl_client sbl.spamhaus.org, reject_rbl_client blackholes.easynet.nl, reject_rbl_client dnsbl.njabl.org
    smtpd_delay_reject = yes
    smtpd_hard_error_limit = 12
    smtpd_helo_required = yes
    smtpd_helo_restrictions = permit_mynetworks, warn_if_reject reject_non_fqdn_hostname, reject_invalid_hostname, permit
    smtpd_recipient_limit = 16
    smtpd_recipient_restrictions = reject_unauth_pipelining, permit_mynetworks, reject_non_fqdn_recipient, reject_unknown_recipient_domain, reject_unauth_destination, permit
    smtpd_data_restrictions = reject_unauth_pipelining
    smtpd_sender_restrictions = permit_mynetworks, warn_if_reject reject_non_fqdn_sender, reject_unknown_sender_domain, reject_unauth_pipelining, permit
    smtpd_soft_error_limit = 3
    smtpd_tls_cert_file = /etc/ssl/certs/ssl-cert-snakeoil.pem
    smtpd_tls_key_file = /etc/ssl/private/ssl-cert-snakeoil.key
    smtpd_tls_session_cache_database = btree:${data_directory}/smtpd_scache
    smtpd_use_tls = yes
    unknown_local_recipient_reject_code = 450
    virtual_alias_maps = mysql:/etc/postfix/mysql_alias.cf
    virtual_gid_maps = mysql:/etc/postfix/mysql_gid.cf
    virtual_mailbox_base = /var/spool/mail/virtual
    virtual_mailbox_domains = mysql:/etc/postfix/mysql_domains.cf
    virtual_mailbox_maps = mysql:/etc/postfix/mysql_mailbox.cf
    virtual_uid_maps = mysql:/etc/postfix/mysql_uid.cf

    Read the article

  • Separate urls for a set of pages sharing 80% duplicate content

    - by user131003
    Issue: Currently my site has one particular page which has country-specific data, so I have URLs like:
    mysite.com/sale-united-states
    mysite.com/sale-united-kingdom
    mysite.com/sale-sweden
    etc. All these pages have 80-90% common content and 10-20% country-specific content. Currently all these pages canonically point to mysite.com/sale-united-states. The problem is that when someone searches for "sale Sweden", Google shows the mysite.com/sale-united-states page, which does not feel right, as it shows the US page instead of the Sweden one. Now I'm thinking of not using the canonical URL, so that the country-specific URLs appear in Google search. But I'm not sure how 80% duplicate content is going to affect SEO. What would be the recommended approach for this situation? A friend of mine suggested a "separate subdomain per country" approach, but it seems overkill for one page.

    Read the article

  • How to prevent a 404 Error when installing a subdomain using a wordpress multi-site installation

    - by Chris
    I have installed a multi-site installation of WordPress on my domain. I then added the necessary code to the wp-config.php file and .htaccess as instructed by WordPress. I also installed a plugin called Quick Page/Post Redirect Plugin, which allowed me to place a 301 redirect on the main domain, as I only want to use the subdomain and not the main domain. Then I also added the following line of code to the wp-config.php file to redirect the main domain:
    define( 'NOBLOGREDIRECT', 'URL Redirect Address' );
    The site works fine with a redirect on the main domain, and my subdomain runs fine when you type in subdomain.domain.com or http://subdomain.domain.com. However, when I enter www.subdomain.domain.com or http://www.subdomain.domain.com, the following error message is returned:
    Not Found
    The requested URL / was not found on this server.
    Apache/2.4.9 (Unix) Server at www.subdomain.domain.com Port 80
    Any help with this would be much appreciated.
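    Assuming the server is actually configured to answer for the www form of the subdomain (that needs a matching DNS record and ServerAlias, which is an assumption here), a minimal .htaccess sketch to fold it into the bare subdomain could look like:

      # Hypothetical sketch: send www.subdomain.domain.com to subdomain.domain.com
      RewriteEngine On
      RewriteCond %{HTTP_HOST} ^www\.(subdomain\.domain\.com)$ [NC]
      RewriteRule ^(.*)$ http://%1/$1 [R=301,L]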

    Read the article

  • Two different websites in one remote hosting

    - by Kor
    My client asked me that a website hosted on one server (and pointed to by a domain) should also be accessible, at a specific directory, from another domain that does not point there. For example: http://www.foo.com, hosted at GoDaddy, holds the full website; http://www.bar.com, hosted at Bluehost, needs to serve http://www.foo.com/bar as if it were http://www.bar.com's root. So, if anybody enters through http://www.bar.com, it should internally load http://www.foo.com/bar without visually changing the URL. I am not sure if this is possible using .htaccess or anything like that. Could anybody shed some light on this? Thanks in advance
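    One possible sketch, assuming the account serving www.bar.com allows the mod_rewrite proxy flag (mod_proxy support is an assumption and is often disabled on shared hosting):

      # Hypothetical .htaccess on www.bar.com: internally proxy all requests to foo.com/bar
      RewriteEngine On
      RewriteRule ^(.*)$ http://www.foo.com/bar/$1 [P,L]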

    Read the article

  • I Purchased a Domain that Previously had a Google Apps Account. How Do I Re-Create The Google Apps Account or Take Ownership of it?

    - by jmort253
    I recently purchased a domain name. We'll call it example.com. The previous owner of the domain had a Google Apps account. Now that I own the domain, I want to create a Google Apps account so I can point the domain www.example.com to one of my Google App Engine domains. We'll call it application.appspot.com. Google App Engine won't allow me to add the domain without verifying ownership by creating a Google Apps account or logging into Google Apps, but I don't have access to the old Google Apps account. We've tried going to this address to take ownership: https://www.google.com/a/cpanel/example.com/ResetAdminPassword?c=LONG_KEY&hl=en_US We retrieved a new password, but it wouldn't tell us what the login name is. How do you find out the login name?

    Read the article

  • Why are subdomains of Blogspot/WordPress-like sites treated as different domains or sites?

    - by Thedijje
    As I understand it, maps.google.com and mail.google.com both come under the same domain, and they are all subdomains. The entire web treats these subdomains as part of the main domain, and they share the same Alexa rank, PageRank and so on. On the other hand, take a look at blogspot.com/wordpress.com/webs.com: these are single sites, yet blogs or websites under those domains are treated as different sites. Each is a new URL, and each has its own PageRank and Alexa rank. There are millions of subdomains under those few domains, with almost the same IP addresses, hosting and CMS, so why are they treated as different domains?

    Read the article

  • "Malformed line 6" error in my /etc/apt/sources.list

    - by Odi1215
    I'm new to Ubuntu so I don't really know much yet. I encountered this problem in the terminal:
    E: Malformed line 6 in source list /etc/apt/sources.list (dist parse)
    E: The list of sources could not be read.
    What should I do? Help would be much appreciated. Here's my sources.list:
    # /etc/apt/sources.list
    deb http://archive.ubuntu.com/ubuntu/ precise main restricted universe multiverse
    deb http://security.ubuntu.com/ubuntu/ precise-security main restricted universe multiverse
    deb http://archive.ubuntu.com/ubuntu/ precise-updates main restricted universe multiverse
    deb http://archive.canonical.com/ partner
    deb-src http://archive.canonical.com/ partner

    Read the article

  • How to approach iOS web clip app "download"?

    - by Ryan
    We have our main site at: http://mysite.com which we don't want to alter. Then we have our web clip app at: http://mysite.com/app/ If someone visits the app URL in normal Safari then the Safari UI will still display. But if the user adds the app URL to their home screen, and then they tap that icon they will launch the app URL without the Safari UI as intended. My question is how do you go about getting the user to use the web clip app from their home screen when they start from mysite.com? What I'm thinking is that we have a link on mysite.com that points to mysite.com/app/. Then when they click that /app/ link they'll go to the app but it won't be in "app mode". Can I detect that it's not in app mode and display a message like "add this page to your home screen to use the app"? And then when they do visit in app mode obviously just let the app run.

    Read the article

  • Rewrite rule to show as directory using .htaccess

    - by chanchal1987
    I want to implement a rewrite rule in my .htaccess file so that a specific URL appears as a directory on my server. See the code I wrote below:
    RewriteRule ^(.*)/$ ?page=$1 [NC]
    This rewrites URLs like www.mysite.com/abc/ to www.mysite.com/index.php?page=abc. But if I request www.mysite.com/abc then it throws a 404 error. How can I write a rewrite rule that matches both www.mysite.com/abc and www.mysite.com/abc/?
    Edit: My current .htaccess file (after the third revision of Litso's answer) is like below:
    ##
    ErrorDocument 401 /index.php?error=401
    ErrorDocument 400 /index.php?error=400
    ErrorDocument 403 /index.php?error=403
    ErrorDocument 500 /index.php?error=500
    ErrorDocument 404 /index.php?error=404
    DirectoryIndex index.htm index.html index.php
    RewriteEngine on
    RewriteBase /
    Options +FollowSymlinks
    RewriteRule ^(.+)\.html?$ $1.php
    RewriteCond !-d
    RewriteRule ^(.*)/$ ?page=$1 [NC,L]
    RewriteCond %{REQUEST_URI} !index.php
    RewriteRule ^(.*)$ ?page=$1 [NC,L]
    ##
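    For illustration, a minimal sketch that treats the trailing slash as optional, assuming page slugs never contain a slash and skipping requests for real files and directories:

      # Hypothetical sketch: map both /abc and /abc/ to index.php?page=abc
      RewriteEngine On
      RewriteCond %{REQUEST_FILENAME} !-f
      RewriteCond %{REQUEST_FILENAME} !-d
      RewriteRule ^([^/]+)/?$ index.php?page=$1 [NC,L]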

    Read the article

  • Switch to https

    - by Mike
    I'm looking to use an .htaccess file with mod_rewrite to switch the protocol from http:// to https:// when someone hits my website. For instance, once someone goes to http://www.mywebsite.com/ I'd like the browser to switch to https://www.mywebsite.com/. The same goes for http://mywebsite.com/ - https://mywebsite.com. This is the code I've been using, and I've experienced some odd things, so if anyone could tell me whether this is the right way to do it, or if you have a better way, please share it. Thanks in advance.
    RewriteEngine On
    RewriteCond %{SERVER_PORT} !=443
    RewriteRule ^(.*)$ https://www.ebaillv.com/$1 [R=301,L]
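    A common alternative sketch that keeps whichever host was requested (www or bare) rather than hard-coding it; this assumes the HTTPS variable is set normally by the server, which may not hold behind a proxy or load balancer:

      # Hypothetical sketch: force HTTPS while preserving the requested host and path
      RewriteEngine On
      RewriteCond %{HTTPS} !=on
      RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]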

    Read the article

  • Google doesn't index a subdomain. What can be the problem and what can be done?

    - by fudge
    Hi! I have a domain, let's call it example.com, which has a subdomain, games.example.com. I maintain a games forum using phpbbseo, which is located at games.example.com/forum. The problem is that the forum is not being crawled. I used Google's Webmaster Tools and verified that the page can be seen by Google. P.S. There is a link from games.example.com to games.example.com/forum. What can I do? How can I make Google crawl my forum?

    Read the article

  • Working with different URL structures

    - by Dane411
    As I'm quite new to this field, I have some doubts, and there are a few I couldn't resolve with Google, e.g.: If I'm not wrong, index.html makes it possible to avoid adding the filename to the URL; www.example.com/ is equal to www.example.com/index.html. And that works for subdirectories as well, right? e.g. www.example.com/music/ Is there any other way to achieve this without using an index.html file? (I've read something about converting dynamic URLs to static ones: ./?var1=value1&varN=valueN - ./value1/valueN) How can I convert www.example.com/music/ to music.example.com/, and why should that be done? Thanks in advance!
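    A hedged .htaccess sketch touching both points: DirectoryIndex is what lets a bare path serve an index file (index.html is only the conventional default), and a rewrite can map static-looking paths onto a dynamic query string. The index.php handler and parameter names are assumptions based on the question's placeholders:

      # Hypothetical sketch
      DirectoryIndex index.html index.php
      RewriteEngine On
      RewriteCond %{REQUEST_FILENAME} !-f
      RewriteCond %{REQUEST_FILENAME} !-d
      # map /value1/valueN to index.php?var1=value1&varN=valueN
      RewriteRule ^([^/]+)/([^/]+)/?$ index.php?var1=$1&varN=$2 [L,QSA]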

    Read the article

  • Juju instances in agent-state: down after turning them off (and back on) on EC2

    - by Tyler McAdams
    I turned my Juju instances off on EC2 for a while, and after bringing them back online they seem to be in an odd state:
    claude-vm@claude-vm-fusion:~/Documents/Shell Scripts$ juju status
    2012-11-17 17:06:44,094 INFO Connecting to environment...
    2012-11-17 17:06:45,590 INFO Connected to environment.
    machines:
      0:
        agent-state: not-started
        dns-name: ec2-54-242-142-196.compute-1.amazonaws.com
        instance-id: i-b0996fcf
        instance-state: running
      1:
        agent-state: down
        dns-name: ec2-50-19-186-245.compute-1.amazonaws.com
        instance-id: i-8c8375f3
        instance-state: running
      2:
        agent-state: down
        dns-name: ec2-54-242-255-238.compute-1.amazonaws.com
        instance-id: i-56807629
        instance-state: running
    services:
      wordpress:
        charm: cs:precise/wordpress-9
        exposed: true
        relations:
          db:
          - wordpress-db
          loadbalancer:
          - wordpress
        units:
          wordpress/0:
            agent-state: down
            machine: 2
            open-ports:
            - 80/tcp
            public-address: ec2-54-242-227-57.compute-1.amazonaws.com
      wordpress-db:
        charm: cs:precise/mysql-10
        relations:
          db:
          - wordpress
        units:
          wordpress-db/0:
            agent-state: down
            machine: 1
            public-address: ec2-54-242-212-177.compute-1.amazonaws.com
    2012-11-17 17:06:47,274 INFO 'status' command finished successfully
    Can I not take my instances down for a while? Or is this something else?

    Read the article

  • htaccess 301 redirect for payment page

    - by Chris Robinson
    I have a client who currently runs a venue and has ticket purchases handled by a third party. The way the site currently works is that there is a standard link in the nav menu to the ticket purchasing site:
    <a href="http://example.com/events">Events</a>
    <a href="http://example.com/about">About</a>
    <a href="https://someticketvendor.com/myclient?blah">Tickets</a>
    They claim that they want to improve their SEO by appearing to integrate the ticket pages into their site. Having spoken to the ticket vendor, they only offer integration through iframes, which is just horrible. I don't really know much about SEO, but I'm wondering if I can create an .htaccess rule to have http://example.com/tickets forward to https://someticketvendor.com/myclient?blah. Are there any negative SEO implications to doing this? Is there a better way this could be done?
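    A minimal sketch of the rule being described, using the URLs from the question (whether it actually helps SEO is a separate matter and not claimed here):

      # Hypothetical sketch: 301-forward /tickets to the third-party vendor
      RewriteEngine On
      RewriteRule ^tickets/?$ https://someticketvendor.com/myclient?blah [R=301,L]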

    Read the article

  • NDepend v4 has just been released!

    - by Vincent Maverick Durano
    A few months ago I blogged about the release of NDepend v3's continuous integration and reporting capabilities here. Recently, the NDepend team released v4, which comes with code rules based on C# LINQ queries (CQLinq); this makes code rules much more powerful and flexible. There are a couple of new rules available, like:
    http://www.ndepend.com/DefaultRules/webframe?Q_UI_layer_shouldn't_use_directly_DB_types.html
    http://www.ndepend.com/DefaultRules/webframe?Q_Types_with_disposable_instance_fields_must_be_disposable.html
    http://www.ndepend.com/DefaultRules/webframe?Q_Avoid_the_Singleton_pattern.html
    http://www.ndepend.com/DefaultRules/webframe?Q_Avoid_making_complex_methods_even_more_complex_(Source_CC).html
    v4 also provides NDepend.API and a dozen open-source tools developed with NDepend.API (the Power Tools): http://www.ndepend.com/API/webframe.html

    Read the article

  • Google Analytics: tracking subdomains for a profile defined for a subdomain

    - by Alex G
    Hope you can help. We have set up a single property under our Google Analytics account. That property's default URL is set to subdomain1.example.com. We would now like to track multiple subdomains of example.com under the same property. Seems easy enough: we just need to add _gaq.push(['_setDomainName', 'example.com']); to our tracking code, right? But my question is: does it matter that a) we don't need to track www.example.com (it is tracked under a separate account and property), and b) the default URL for our property is set to subdomain1.example.com? Will either of these have any impact on data collection?

    Read the article
