Search Results

Search found 4781 results on 192 pages for 'seo audit'.

Page 45 of 192

  • Different data with same title and keywords

    - by Junaid Saeed
    Here is my scenario: I have a website where I redirect users based on the device they are using. Say a visitor arrives from an iPad; I take them directly to the iPad wallpapers page, they select their iPad version, and I take them to a gallery where they can pick and download any wallpaper at the required resolution. I have my reasons for doing this. The issue is that different resolution versions of the same image appear in 5 different sections of my website, each with its own view page. There is only one record in the database table for the image, and based on my consistent naming convention I pick the required version. This means that when the 5 pages are generated in the 5 categorized sections of the website, they share one DB record, so the keywords, the titles and every other detail of the 5 pages are the same, apart from the resolution of the image and the section-specific details each page has. The pages also have different paths, like wallpapers.com\ipad-1\cars\Ferrari-dino.html, wallpapers.com\ipad-2\cars\Ferrari-dino.html, wallpapers.com\ipad-3\cars\Ferrari-dino.html, wallpapers.com\ipad-4\cars\Ferrari-dino.html and wallpapers.com\ipad-5\cars\Ferrari-dino.html. That is my scenario. How do search engines see it and how do they rank it? Is it a good, normal or bad SEO practice? If bad, how dangerous is it for my site's SEO? I need your comments on my scenario.
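
    For illustration, one widely documented way of signalling this kind of near-duplication to search engines is a canonical link element that points the device-specific variants at a single preferred URL. A minimal sketch follows, reusing the asker's example path (written with forward slashes, as a browser URL would be) and assuming, purely for the example, that the ipad-1 page is chosen as the preferred version:

        <!-- sketch only: placed in the <head> of each of the five variant pages,
             pointing at whichever version is chosen as the preferred one -->
        <link rel="canonical" href="http://wallpapers.com/ipad-1/cars/Ferrari-dino.html">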

    Read the article

  • Why do some bad websites rank well?

    - by BradB
    Consider the following scenario: you are pitching SEO/website optimisation to a prospective client. You explain the importance of great copy and content, how acquiring links (ethically) can increase PageRank, why the quality of the HTML build matters (H1 and H2 tags, W3C validation, etc.) and why keyword research is beneficial. You may drop in a few Google Webmaster Guidelines or Matt Cutts references to back up your claims, and rubbish the "black hat" approach as no longer effective for good measure. Your advice is ethical and, in the eyes of best practice, spot on. Then the client points out some of their long-established competitors on Google, and you see those competitor websites ranking in the top spots (1 to 3) for the medium-to-highly competitive search phrases your client wants to compete for. These websites totally contradict your ethical approach and violate pretty much every best practice noted above. They even outperform other "white hat" competitors who do follow the guidelines. I experienced this today. One of these well-ranking websites had:
    - about six microsites with more or less the same copy and a slightly varied layout;
    - little or no textual content (I would almost say duplicate content across the sites, but there was so little of it that it could barely qualify as duplicate);
    - all of the content in Flash, with a music track that kicked in on each page load (not so much an SEO issue, but it helps paint the picture);
    - keyword stuffing behind the Flash file: a bunch of black text on a black background in the style of "keyword 1 keyword 2, keyword1, keyword 2, keyword 2 keyword 3" and so on, with the exact same stuffed combination present on every page of the website;
    - a bunch of clearly self-made links from poor-quality forums and directories with little or no PageRank;
    - links exchanged across the microsites.
    How do you explain your way out of this when this hard evidence is sitting in front of you, undermining your great pitch?

    Read the article

  • Transfer page from internal to external

    - by Theo Gulland
    Afternoon all! Currently I have a website with a list of audio products (essentially a search engine for audio deals): http://www.soundplaza.co.uk. Once you go to the details page, you can press the 'view deal' button to go to the provider's site, e.g. http://www.soundplaza.co.uk/all-deals/113/bookshelf-speakers/acoustic-energy-1. This jump between two sites is a bit harsh, and I would like to show a transition page to ease visitors into the other site and not scare them off. Within this transition page I will have a simple loading GIF and some graphics showing that you are being transferred. QUESTION: what is the best way to send the details (link, product name, etc.) to this transfer page, wait 5 seconds, and then move on to the desired link, in a way that can in NO WAY damage my SEO? If anything, rel="nofollow" would be great if possible. Currently I have seen that you can submit a form to the transition page and then use PHP sleep() and then a PHP header() call to transfer, however I am not sure whether the header() redirect will pass SEO value to the provider. Any opinions would be great! Thanks
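
    For illustration only, a minimal sketch of the kind of transition page described above, assuming a hypothetical transfer.php that receives the outbound URL and product name as query parameters (the parameter names and the allow-listed host are placeholders, not taken from the question). It shows the interstitial immediately and uses a client-side meta refresh plus a nofollowed fallback link, rather than a server-side header() hop to the provider:

        <?php
        // transfer.php -- hypothetical interstitial page (sketch, not the site's actual code)
        $url  = isset($_GET['url'])  ? $_GET['url']  : '';
        $name = isset($_GET['name']) ? $_GET['name'] : 'the provider';

        // Only allow known outbound hosts, so the page cannot be abused as an open redirect.
        $allowed = array('www.example-provider.co.uk');          // assumption: a maintained allow-list
        $host = parse_url($url, PHP_URL_HOST);
        if (!in_array($host, $allowed, true)) {
            header('Location: /');                               // bail out to the home page
            exit;
        }
        $safeUrl  = htmlspecialchars($url, ENT_QUOTES, 'UTF-8');
        $safeName = htmlspecialchars($name, ENT_QUOTES, 'UTF-8');
        ?>
        <!DOCTYPE html>
        <html>
        <head>
          <meta name="robots" content="noindex">                 <!-- keep the interstitial itself out of the index -->
          <meta http-equiv="refresh" content="5;url=<?php echo $safeUrl; ?>">
          <title>Transferring you to <?php echo $safeName; ?>...</title>
        </head>
        <body>
          <p>Taking you to <?php echo $safeName; ?> in 5 seconds...</p>
          <p><a href="<?php echo $safeUrl; ?>" rel="nofollow">Click here if you are not redirected.</a></p>
        </body>
        </html>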

    Read the article

  • Google results show .info domain instead of .com

    - by user481913
    I am on shared hosting and I registered this account with a .info domain as the main domain, say MyDomain.info. However, the site runs from MyDomain.com. This is a cPanel-based shared hosting account. MyDomain.info has nothing hosted at all, i.e. no content files. MyDomain.com is set up as an add-on domain and runs from /public_html/MyDomain under MyDomain.info. The problem is that when I type MyDomain as the search keyword in Google, it shows result(s) for MyDomain.info, although this is not the intended site and has no content of its own. I tried to solve the issue by issuing a 301 permanent redirect from MyDomain.info to MyDomain.com, but Google keeps displaying MyDomain.info as the main site even one month after the redirect. I want Google to index MyDomain.com as the main site and remove MyDomain.info from the results. Also, is this harmful from an SEO point of view? If so, how can I improve the SEO?
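
    For reference, a minimal sketch of the kind of 301 rule described above, assuming Apache with mod_rewrite and the placeholder names MyDomain.info / MyDomain.com (the actual rule on the asker's account is not shown in the question). Because both domains live under the same cPanel account, the rule keys on the requested hostname rather than the path:

        # .htaccess in the shared document root: send any request for the .info host to the .com with a 301
        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^(www\.)?mydomain\.info$ [NC]
        RewriteRule ^(.*)$ http://www.mydomain.com/$1 [R=301,L]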

    Read the article

  • Guideline for promoting a web forum or portal

    - by Hafiz
    I am a web application developer and have developed a lot of sites, portals and apps for clients. Now I want to have such sites of my own, with which I can do business, but I don't know what the steps are. What I know is how to build a site or portal, but what comes after that? There are lots of people getting a great deal of traffic on sites built with simple open-source software; many of them are forums. I also want to start a forum and a web portal. I can develop them myself or use open source, but what comes after that? Content entry and SEO? Is that all it takes to promote a portal or site? Does SEO still work nowadays, or is it all about marketing and advertising? I have no idea, so please tell me what you suggest. Thanks for everyone's opinion in advance.

    Read the article

  • Naming your website longname.com vs shortcatchy.net vs shortcatchy.info

    - by jskye
    I'm designing a website that will basically be a social network for sharing information. I have the domain $$$$d.net and the same domain as $$$$d.info, where $$$$ is a word (that runs into the d) pertaining to the purpose of the site. The .com of this domain was already taken, but its owners have nothing showing, only a "could not be reached" error in Google, and they don't seem to be trying to sell it either. I also have the long name of the site, $$$$------&&&&&&&&&.com, where the words $$$$ and &&&&&&&&& would contribute relevant SEO to the site. In fact the word $$$$------ would too, if a one-letter spelling mistake is recognised at all by Google, which I doubt but am unsure about; as a brand name, though, the $$$$------ word still works. Which do you think is the better choice?
    1. The short catchy name with .info, for its relevance to information.
    2. The .net, which is more familiar than .info but perhaps slightly less relevant (although I think "net" as in "network" still works, because as I said it will be a social networking site).
    3. The long .com domain, which has more SEO value plus a pun, albeit on a spelling mistake.
    I know it's a subjective question, and hard to answer without knowing the name (which I've obfuscated because I'm only at the initial design stage), but nevertheless I'm interested in what some of you think.

    Read the article

  • How to target just one search engine and optimise for that

    - by mickburkejnr
    I've been dabbling with SEO a lot in the last 6 months, and one thing that has surprised me is the disparity between Google and Bing in the way they deliver results. A website may rank 3rd on the first page of Google for a specific keyword/phrase, yet rank 15th on Bing for exactly the same phrase. I came up with the idea of increasing traffic to my website by targeting Bing instead of Google, for several reasons. The biggest is that while Bing is not the biggest search provider, people still use it, and I feel that if other websites have been "neglected" and not optimised for Bing, my website would stand a better chance of getting near the top of its rankings. The question, though, is how would I do this? A lot of the SEO advice on the internet is generic, but I can't help feeling it is Google-orientated for obvious reasons. How could I optimise my website to be Bing-friendly rather than Google-friendly? I know it sounds like suicide to take myself out of the Google mindset, but I feel it could work wonders for traffic to the site.

    Read the article

  • Want Google to index redirect urls

    - by Dave Goten
    I'm having issues with users who treat Google Search as the address bar. Some of the sites that link to my site use friendly addresses with 301 redirects to pages that have less friendly URLs. So, for example, if I enter www.foo.com/bar it goes to www.bar.com/page.php?some-parameters-and-utm-codes-etc. Usually this is done with a 301 redirect in order to carry the SEO from foo.com over to bar.com and so on, which I believe is standard practice. Lately, however, more and more people have been searching for www.foo.com/bar instead of going to www.foo.com/bar directly, and because the page /bar is nothing more than a redirect it has no SEO presence that I know of. Things I've thought of but haven't been able to test (because Google takes forever to update :) and I'm lazy like that) include:
    - Using Google sitemaps and having the linking sites enter their redirects as entries there. (I could see this working if they were the top search entry all the time, and it might appear as a sitelink, but I don't know whether that would make the URL itself show up in searches.)
    - Using canonical tags on my pages pointing to the redirects they set up. This is a nightmare in itself because of the nature of my pages: one week www.foo.com/bar might go to www.bar.com/pageA.php, the next it might go to www.bar.com/pageB.php, and having to remember to take the canonical tag off pageA so that it doesn't get confused with pageB would be a pain.
    - Using 302 redirects -.-
    So I guess the question here is: does anyone have any experience or knowledge about this? What should I do to make www.foo.com/bar show up when someone 'searches' for this redirect URL?
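
    As a point of reference, the sitemap idea mentioned above would normally mean listing the friendly URL in an XML sitemap on the host that serves it; a minimal sketch follows, using the asker's placeholder www.foo.com/bar (whether Google would then show the redirecting URL itself in results is exactly the open question here):

        <?xml version="1.0" encoding="UTF-8"?>
        <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
          <url>
            <loc>http://www.foo.com/bar</loc>
            <changefreq>weekly</changefreq>
          </url>
        </urlset>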

    Read the article

  • What could have caused a large traffic drop from Google in early May?

    - by Scott Schluer
    I have a website (www.equispot.com) that has been indexed for almost 2 years in Google. I managed to get myself on the first page (average position 6-8) on Google for my target keyword of "horses for sale" and held there pretty solidly for months. Suddenly, with no changes to the site, traffic from Google dropped like a rock in early May. I slowly fell in position until now I'm sitting at the bottom of page 4. I have never hired an SEO firm, have not used any "black hat" techniques that Google would have penalized me for in their May update, etc. I'm not familiar enough with SEO to know how to look at link profiles, etc. to tell if there's something wrong. I've run my site through a DNS checker and it came back with no errors. Google Webmaster Tools shows no messages or notices of any kind, just a drop in traffic. GWT also shows only 2 server errors and 1 404. Is there anyone who can tell me by quickly checking my domain if there's an obvious reason that my traffic would have fallen so far, something that I can fix?

    Read the article

  • Multiple domains for different products?

    - by alexandertr
    I have a website with software applications. Is it better for SEO to choose one keyword-rich domain name for each of our software products, or should we stick to a single domain? From a user's perspective I think a keyword-rich domain would be easier to remember, as the user instantly knows what the product is for. But I have read articles saying that the latest trend in SEO is to stick to one domain for all of your products and invest in that single domain. Is that true? What do you advise? Should I register a separate domain for each of our products, or use only one domain? Should I do a 301 redirect with .htaccess to a single domain? And what about the sitemaps? Should I register all sites in Google Webmaster Tools and post a separate sitemap for each of them? Should my main site's sitemap include all pages, or should the separate domains have their own sitemaps?

    Read the article

  • Checking the configuration of two systems to determine changes

    - by None
    We are standing up a replica data center at work and need to ensure that the new data center is configured (nearly) identically to the original. The new data center will be addressed and named differently from the original and will have different user accounts, but all the COTS products, patches, and configurations should be the same. We would normally ghost the original servers and install those images onto the new machines; however, we have a few problematic pieces of COTS software that have to be installed outside of an image, because they capture the setup of the network during their installation and keep it within their configuration information (in some cases storing it in various databases). We have tried multiple times, and this COTS software cannot be captured within a ghost image unless the destination machine has an identical network setup (all the same IPs, hostnames, user accounts, etc. across the entire network) as the original. In truth, it is the setup of these special COTS products that I most want to audit, because they are difficult to install and configure in the first place. Since we can't simply ghost, I'm trying to find a reasonable way to audit the new data center and check whether it is set up like the original (some sort of system-wide configuration audit or integrity check). I'm considering using something like Tripwire for Servers to capture the configuration on the source machines and then run an audit on the destination machines. I understand that it will still show some differences due to the minor config changes, but I'm hoping it will eliminate the majority of the work. Here are some of the constraints I'm working under:
    - The data center is comprised of multiple Windows and Linux machines of differing versions (about 20 total).
    - I absolutely cannot ghost or snap any other type of image of these machines, at least not in their final configuration.
    - I want to audit the final configuration to ensure all of the COTS products, patches, configurations, etc. are installed and set up properly (as compared to the original data center).
    - I would rather not install any additional tools on these machines; I'd much rather run the audit from a standalone machine or off a DVD.
    - Price of tools is important but not an impossible burden; however, getting a solution soon is important (I can't take the time to roll my own tools for this).
    - For the COTS software that stores the network information, I don't know all of the places it stores that information, so it is unlikely I could find a way in the near future to adjust its setup after the installation has occurred.
    Does anyone have any thoughts or alternative approaches? Can anyone recommend tools that would be usable for system-wide configuration audits?

    Read the article

  • How to record different authentication types (username / password vs token based) in audit log

    - by RM
    I have two types of users in my system: normal human users with a username/password, and delegation-authorized accounts through OAuth (i.e. using a token identifier). The information stored for each is quite different, and they are managed by different subsystems. They do, however, interact with the same tables/data within the system, so I need to maintain the audit trail regardless of whether a human user or a token-based user modified the data. My solution at the moment is to have a table called something like AuditableIdentity, and have the two types inherit from that table (either within the single table, or as two separate tables with a 1-to-1 PK to AuditableIdentity). All operations would use the common AuditableIdentity PK for the CreatedBy, ModifiedBy etc. columns. There isn't any FK constraint on the audit columns, so any text can go in there, but I want an easy way to determine whether a human or the system made the change, and joining to the one AuditableIdentity table seems like a clean way to do that. Is there a best practice for this scenario? Is this an appropriate way of approaching the problem, or would you not bother with the common table and just rely on joins (to the two separate, unrelated user/token tables) later to determine which user type matches which audit records?
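
    A minimal sketch of the supertype/subtype arrangement described above, with hypothetical table and column names (the asker's actual schema is not shown in the question, and the foreign keys on the audit columns are optional; the question notes they are absent today):

        -- Common identity that every audit column points at
        CREATE TABLE AuditableIdentity (
            identity_id   INT PRIMARY KEY,
            identity_type CHAR(1) NOT NULL CHECK (identity_type IN ('H', 'T'))  -- H = human, T = token
        );

        -- Human users: 1-to-1 with the supertype
        CREATE TABLE HumanUser (
            identity_id   INT PRIMARY KEY REFERENCES AuditableIdentity(identity_id),
            username      VARCHAR(100) NOT NULL UNIQUE,
            password_hash VARCHAR(255) NOT NULL
        );

        -- OAuth/token accounts: also 1-to-1 with the supertype
        CREATE TABLE TokenAccount (
            identity_id      INT PRIMARY KEY REFERENCES AuditableIdentity(identity_id),
            token_identifier VARCHAR(255) NOT NULL UNIQUE
        );

        -- An audited business table referencing the common PK
        CREATE TABLE Widget (
            widget_id   INT PRIMARY KEY,
            created_by  INT NOT NULL REFERENCES AuditableIdentity(identity_id),
            modified_by INT NULL     REFERENCES AuditableIdentity(identity_id)
        );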

    Read the article

  • How to keep historic details of modification in a database (Audit trail)?

    - by mada
    I'm a J2EE developer and we are using Hibernate mappings with a PostgreSQL database. We have to keep track of any changes that occur in the database; in other words, all previous and current values of any field should be saved. Each field can be of any type (bytea, int, char, ...). With a simple table it is easy, but with a graph of objects things are more difficult. So, speaking from a UML point of view, we have a graph of objects to store in the database along with every change and the user who made it. Any idea or pattern for how to do that?
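
    Two patterns that come up often for this are Hibernate Envers (marking entities as audited so Hibernate writes versioned rows for the whole object graph) and plain database triggers writing old and new values into a history table. Below is a minimal sketch of the trigger variant only, for a hypothetical item table with a single name column; a real schema would need one history table (or a generic key/value layout) per audited table, and this is an illustration rather than a drop-in solution:

        -- Hypothetical audited table
        CREATE TABLE item (
            id   integer PRIMARY KEY,
            name text
        );

        -- History rows: one per change, keeping old and new values plus who/when.
        -- Note: current_user records the database role; a connection-pooled J2EE app
        -- would have to pass the end user's id in explicitly instead.
        CREATE TABLE item_history (
            history_id bigserial   PRIMARY KEY,
            item_id    integer     NOT NULL,
            old_name   text,
            new_name   text,
            changed_by text        NOT NULL DEFAULT current_user,
            changed_at timestamptz NOT NULL DEFAULT now()
        );

        CREATE OR REPLACE FUNCTION item_audit() RETURNS trigger AS $$
        BEGIN
            INSERT INTO item_history (item_id, old_name, new_name)
            VALUES (OLD.id, OLD.name, NEW.name);
            RETURN NEW;
        END;
        $$ LANGUAGE plpgsql;

        CREATE TRIGGER trg_item_audit
        AFTER UPDATE ON item
        FOR EACH ROW EXECUTE PROCEDURE item_audit();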

    Read the article

  • Is there an API to remotely read a Windows machine's audit configuration?

    - by JCCyC
    I need to know, for each subcategory, whether it'll be audited on success, on failure, both, or none. Below is an example of the information I need to collect. Can I get this through WMI? Or if not, by other means, assuming I have proper (admin) credentials to the target machine? Again, to clarify, it's not the event log I need to read, it's the logging configuration.
    <security_state_change>AUDIT_SUCCESS</security_state_change>
    <security_system_extension>AUDIT_NONE</security_system_extension>
    <system_integrity>AUDIT_SUCCESS_FAILURE</system_integrity>
    <ipsec_driver>AUDIT_NONE</ipsec_driver>
    <other_system_events>AUDIT_SUCCESS_FAILURE</other_system_events>
    <logon>AUDIT_SUCCESS</logon>
    <logoff>AUDIT_SUCCESS</logoff>
    <account_lockout>AUDIT_SUCCESS</account_lockout>
    <ipsec_main_mode>AUDIT_NONE</ipsec_main_mode>
    <ipsec_quick_mode>AUDIT_NONE</ipsec_quick_mode>
    <ipsec_extended_mode>AUDIT_NONE</ipsec_extended_mode>
    <special_logon>AUDIT_SUCCESS</special_logon>
    <other_logon_logoff_events>AUDIT_NONE</other_logon_logoff_events>
    <file_system>AUDIT_NONE</file_system>
    <registry>AUDIT_NONE</registry>
    <kernel_object>AUDIT_NONE</kernel_object>
    <sam>AUDIT_NONE</sam>
    <certification_services>AUDIT_NONE</certification_services>
    <application_generated>AUDIT_NONE</application_generated>
    <handle_manipulation>AUDIT_NONE</handle_manipulation>
    <file_share>AUDIT_NONE</file_share>
    <filtering_platform_packet_drop>AUDIT_NONE</filtering_platform_packet_drop>
    <filtering_platform_connection>AUDIT_NONE</filtering_platform_connection>
    <other_object_access_events>AUDIT_NONE</other_object_access_events>
    <sensitive_privilege_use>AUDIT_NONE</sensitive_privilege_use>
    <non_sensitive_privlege_use>AUDIT_NONE</non_sensitive_privlege_use>
    <other_privlege_use_events>AUDIT_NONE</other_privlege_use_events>
    <process_creation>AUDIT_NONE</process_creation>
    <process_termination>AUDIT_NONE</process_termination>
    <dpapi_activity>AUDIT_NONE</dpapi_activity>
    <rpc_events>AUDIT_NONE</rpc_events>
    <audit_policy_change>AUDIT_SUCCESS</audit_policy_change>
    <authentication_policy_change>AUDIT_SUCCESS</authentication_policy_change>
    <authorization_policy_change>AUDIT_NONE</authorization_policy_change>
    <mpssvc_rule_level_policy_change>AUDIT_NONE</mpssvc_rule_level_policy_change>
    <filtering_platform_policy_change>AUDIT_NONE</filtering_platform_policy_change>
    <other_policy_change_events>AUDIT_NONE</other_policy_change_events>
    <user_account_management>AUDIT_SUCCESS</user_account_management>
    <computer_account_management>AUDIT_NONE</computer_account_management>
    <security_group_management>AUDIT_SUCCESS</security_group_management>
    <distribution_group_management>AUDIT_NONE</distribution_group_management>
    <application_group_management>AUDIT_NONE</application_group_management>
    <other_account_management_events>AUDIT_NONE</other_account_management_events>
    <directory_service_access>AUDIT_NONE</directory_service_access>
    <directory_service_changes>AUDIT_NONE</directory_service_changes>
    <directory_service_replication>AUDIT_NONE</directory_service_replication>
    <detailed_directory_service_replication>AUDIT_NONE</detailed_directory_service_replication>
    <credential_validation>AUDIT_NONE</credential_validation>
    <kerberos_ticket_events>AUDIT_NONE</kerberos_ticket_events>
    <other_account_logon_events>AUDIT_NONE</other_account_logon_events>
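
    For comparison, on the machine itself (or over a remote shell such as WinRM or psexec), the advanced audit policy that produces exactly this per-subcategory success/failure information can be dumped with the built-in auditpol tool; whether that is an acceptable substitute for a true WMI call is, of course, the asker's question:

        auditpol /get /category:*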

    Read the article

  • SEO: Is it recommended to put a beta / unfinished version of a website live?

    - by Jonathan
    I'm working on a big website and I want to put it online before it is fully finished. I'm working locally and the database is getting really big, so I want to upload the website and continue working on it on the server, while allowing people in so I can test. The question is whether this is good for SEO; I mean, there are a lot of SEO-related things that are incomplete. For example: there are no friendly URLs, no sitemap, no .htaccess file, and lots of 'under construction' sections. Will Google penalize me forever? How does it work: does Google index and capture the structure of the site just once, or is it constantly updating and checking for changes? What should I do? What do you recommend?

    Read the article

  • How to Audit Database Activity without Performance and Scalability Issues?

    - by GotoError
    I need to audit all database activity regardless of whether it came from the application or from someone issuing SQL by other means, so the auditing must be done at the database level. The database in question is Oracle. I looked at doing it via triggers and also via something called Fine-Grained Auditing (FGA) that Oracle provides. In both cases, we turned on auditing for specific tables and specific columns. However, we found that performance really suffers with either of these methods. Since auditing is an absolute must due to data privacy regulations, I am wondering what the best way is to do this without significant performance degradation. Oracle-specific experience would be helpful, but general practices around auditing database activity are welcome as well.
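
    For reference, the Fine-Grained Auditing mentioned above is configured through the DBMS_FGA package; a rough sketch of a policy on a hypothetical APP.ORDERS table follows (the schema, table and column names are placeholders). Narrowing audit_column and adding an audit_condition is the usual first lever when auditing everything proves too expensive:

        BEGIN
          DBMS_FGA.ADD_POLICY(
            object_schema   => 'APP',
            object_name     => 'ORDERS',
            policy_name     => 'ORDERS_AUDIT',
            audit_column    => 'CREDIT_CARD_NO',           -- only statements touching this column are audited
            audit_condition => NULL,                       -- no row filter; add one to cut volume further
            statement_types => 'SELECT,INSERT,UPDATE,DELETE'
          );
        END;
        /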

    Read the article

  • How to handle the # (hash) character in an SEO-friendly URL?

    - by arvinsim
    How do you bypass the default behaviour of # (hash), which is to jump to a specific part of a page? The problem I have is that the # character is part of an SEO-friendly URL built from a title, and the hash is part of the content (as with C#, for instance). I can't retrieve the whole string; I only get the characters before the #. Example: for www.domain.com/C#-programming-book I only get 'C' and not the '-programming-book' part. I am not using any JavaScript at the moment and would like a PHP-only solution. Before anyone suggests URL encoding: the criterion for the SEO-friendly URL is that it should be human-readable and easily remembered, so converting the hash to '%23' does not meet that criterion. Is there no way around it?
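
    One constraint worth keeping in mind is that browsers never send the fragment (everything after #) to the server at all, so no PHP on the receiving end can recover it; the usual workaround is to encode the hash away when the slug is generated. A minimal sketch with a hypothetical slug helper follows (the 'sharp' spelling for # is an assumption for the example, not something from the question):

        <?php
        // Hypothetical slug builder: spell out '#' before the rest of the cleanup,
        // so "C# programming book" becomes "c-sharp-programming-book".
        function make_slug($title) {
            $slug = str_ireplace('#', ' sharp ', $title);       // '#' never survives into the URL
            $slug = strtolower($slug);
            $slug = preg_replace('/[^a-z0-9]+/', '-', $slug);   // collapse everything else to dashes
            return trim($slug, '-');
        }

        echo make_slug('C# programming book');   // prints: c-sharp-programming-book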

    Read the article

  • Zabbix Trigger for SELinux (type=AVC) Errors

    - by Kevin Soviero
    I would like to create a trigger in Zabbix to alert me any time a type=AVC error appears in a CentOS 6 server's /var/log/audit/audit.log file. I've already tried creating a basic log scrape, e.g.:
    log[/var/log/audit/audit.log,type=AVC,"UTF-8",100]
    However, it does not work. I believe this is because /var/log/audit/audit.log and its parent folder use the following permissions:
    drwxr-x---. 2 root root 4096 Apr 20 04:29 .
    drwxr-xr-x. 13 root root 4096 Apr 14 12:07 ..
    -rw-------. 1 root root 5948185 Apr 20 15:27 audit.log
    -r--------. 1 root root 6291566 Apr 20 04:29 audit.log.1
    -r--------. 1 root root 6291704 Apr 19 16:56 audit.log.2
    -r--------. 1 root root 6291499 Apr 19 05:22 audit.log.3
    -r--------. 1 root root 6291552 Apr 18 17:48 audit.log.4
    I would prefer not to change the permissions, for security reasons. Has anyone done log monitoring of /var/log/audit/audit.log using Zabbix? And if so, how?
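
    For what it's worth, two approaches commonly discussed for this situation are sketched below; neither has been tested against this exact box, SELinux may still need its own allowance for the Zabbix agent, and the group/user names are assumptions. One lets auditd itself write group-readable logs via its log_group setting, the other grants the zabbix user read access with POSIX ACLs while leaving the base permissions untouched:

        # Option 1 (auditd's own setting): in /etc/audit/auditd.conf
        #     log_group = zabbix
        # then restart auditd so newly written logs are group-readable.

        # Option 2 (targeted ACLs, base permissions unchanged):
        setfacl -m u:zabbix:x /var/log/audit
        setfacl -m u:zabbix:r /var/log/audit/audit.log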

    Read the article

  • Diagnosing Logon Audit Failure event log entries

    - by Scott Mitchell
    I help a client manage a website that runs on a dedicated web server at a hosting company. Recently we noticed that over the last two weeks there have been tens of thousands of Audit Failure entries in the Security event log with a Task Category of Logon; these were arriving about every two seconds, but interestingly stopped altogether two days ago. In general, the event description looks like the following:
    An account failed to log on.
    Subject:
      Security ID: SYSTEM
      Account Name: ...The Hosting Account...
      Account Domain: ...The Domain...
      Logon ID: 0x3e7
      Logon Type: 10
    Account For Which Logon Failed:
      Security ID: NULL SID
      Account Name: david
      Account Domain: ...The Domain...
    Failure Information:
      Failure Reason: Unknown user name or bad password.
      Status: 0xc000006d
      Sub Status: 0xc0000064
    Process Information:
      Caller Process ID: 0x154c
      Caller Process Name: C:\Windows\System32\winlogon.exe
    Network Information:
      Workstation Name: ...The Domain...
      Source Network Address: 173.231.24.18
      Source Port: 1605
    The value in the Account Name field differs. Above you see "david", but there are entries with "john", "console", "sys", and even ones like "support83423" and so on. The Logon Type field indicates that the logon attempt was a remote interactive attempt via Terminal Services or Remote Desktop. My presumption is that these are brute-force attacks attempting to guess username/password combinations in order to log into our dedicated server. Are these presumptions correct? Are these types of attacks pretty common? Is there a way to help stop them? We need to be able to access the desktop via Remote Desktop, so simply turning off that service is not feasible. Thanks
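
    As one illustration of the kind of mitigation often suggested for this pattern (restricting who can reach RDP at all, rather than turning it off), a hedged sketch using the built-in Windows firewall follows; the management IP range is a placeholder, an equivalent rule can be created in the firewall GUI, and the existing broad Remote Desktop allow rule would still need to be disabled or similarly scoped:

        rem Sketch: allow RDP only from a known management range (203.0.113.0/24 is a placeholder),
        rem then disable or scope the broader built-in Remote Desktop allow rule.
        netsh advfirewall firewall add rule name="RDP - management range only" dir=in action=allow protocol=TCP localport=3389 remoteip=203.0.113.0/24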

    Read the article

  • SELinux - Allow multiple services access to same /home/dir

    - by Mike Purcell
    I currently have SELinux enabled and have been able to configure Apache to allow access to /home/src/web with a chcon command granting the httpd_sys_content_t type. Now I am trying to serve the rsyslogd.conf file from the same directory, but every time I start rsyslogd I see an entry in my audit log saying that rsyslogd was denied access. My question is: is it possible to grant two applications access to the same directory while still keeping SELinux enabled? Current permissions on /home/src:
    drwxr-xr-x. src src unconfined_u:object_r:httpd_sys_content_t:s0 src
    Audit log message:
    type=AVC msg=audit(1349113476.272:1154): avc: denied { search } for pid=9975 comm="rsyslogd" name="/" dev=dm-2 ino=2 scontext=unconfined_u:system_r:syslogd_t:s0 tcontext=system_u:object_r:home_root_t:s0 tclass=dir
    type=SYSCALL msg=audit(1349113476.272:1154): arch=c000003e syscall=2 success=no exit=-13 a0=7f9ef0c027f5 a1=0 a2=1b6 a3=0 items=0 ppid=9974 pid=9975 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=30 comm="rsyslogd" exe="/sbin/rsyslogd" subj=unconfined_u:system_r:syslogd_t:s0 key=(null)
    -- Edit --
    I came across a post which is sort of what I am trying to accomplish. However, when I viewed the list of allowed sebool parameters, the only one relating to syslog was syslogd_disable_trans (SELinux Service Protection). It seems like I could keep the current SELinux 'type' on the /home/src/ dir but set the syslogd_disable_trans boolean to false. I wonder if there is a better approach?
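
    For illustration, two standard tools usually brought up for this kind of denial are sketched below (the module name and the path regex are placeholders, and the generated policy should be reviewed before loading): making the label persistent with semanage/restorecon instead of chcon, and generating a small local policy module from the logged AVC with audit2allow:

        # Persistent labelling (a plain chcon is lost on a filesystem relabel); the path pattern is an example
        semanage fcontext -a -t httpd_sys_content_t "/home/src/web(/.*)?"
        restorecon -Rv /home/src/web

        # Build a local policy module from the rsyslogd denials recorded in audit.log
        ausearch -m avc -c rsyslogd | audit2allow -M rsyslogd_home
        semodule -i rsyslogd_home.pp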

    Read the article

  • Does hosting multiple sites on one server hurt your SEO?

    - by MarathonStudios
    I have a handful of (content-unrelated) sites with decent PRs and I'm considering hosting them all on the same server. I've heard that if you do this, linking between two separate domains on that server may be seen as less "valid" by Google in PageRank terms (since you obviously own both sites, as they share an IP address). Does anyone have any experience of this? I'd love to save some hosting cash by consolidating, but not at the expense of losing the ability to link my sites together powerfully.

    Read the article

  • Can anyone recommend a Google SERP tracker?

    - by Haroldo
    I want to track my website's position in Google's search results for around 50 keywords/phrases, and I am looking for a nice web service or Windows application to automate this process. Ideally, I want to see pretty JavaScript or Flash line graphs of my keywords and their positions. I'm currently free-trialing Raven Tools and Sheer SEO, but I am not particularly impressed with either. My budget is up to £25-30/$30-40 per month for a decent rank checker.

    Read the article

  • Is there such a thing as a Google Result Set simulator?

    - by Dave Adams
    I am always making tweaks to my site, be it in the .htaccess file, some new SEO plugin, different types of content or whatever. For all these changes, I would really like to be able to test immediately and see whether the change had any positive or negative effect. I am wondering whether there is some way of doing immediate testing using a simulator, instead of having to wait for Google to discover and index the change, which can take a long time.

    Read the article

  • Interesting links week #10

    - by erwin21
    Below is a list of interesting links that I found this week:
    Interaction: The Ultimate 20 Usability Tips for Your Website
    Frontend: Adobe Releases Flash-to-HTML5 Converter, Codenamed Wallaby
    Development: 10 Tips for Decreasing Web Page Load Times; Ten Things Every WordPress Plugin Developer Should Know; Progressive enhancement tutorial with ASP.NET MVC 3 and jQuery
    Marketing: 5 Tips for SEO & User-Friendly Copy
    Interested in more interesting links? Follow me on Twitter: http://twitter.com/erwingriekspoor

    Read the article

  • Blocking Just the Parent Domain via robots.txt

    - by Bryan Hadaway
    Let's say you have a parent domain, parent.com, and child subdomains under that parent domain: child1.com, child2.com, child3.com. Is there a way to use just the following within parent.com:
    User-agent: *
    Disallow: /
    considering each child has its own robots.txt stating:
    User-agent: *
    Allow: /
    Or is the parent robots.txt still going to have to make an exception for every single subdomain:
    User-agent: *
    Disallow: /
    Allow: /child1/
    Allow: /child2/
    Allow: /child3/
    Obviously this is important and tricky territory SEO-wise, so I'm looking to learn the definitive, safe, best-practice method here to sharpen my skills. Thanks, Bryan

    Read the article
