Search Results

Search found 14789 results on 592 pages for 'pro backup'.


  • My site works with www.example.com but not example.com

    - by toocool
    This is the site that I am going to develop: http://www.juve-news.com/ It works with the www prefix, but when I try to open it without the www it gives me a 400 Bad Request error. I have used other web hosts before; now I am trying a new one, and I had to add some DNS entries like in the picture here: http://cloudcontrol.com/developers/documentation/add-ons/aliases/ I don't know what I have done wrong. If anyone knows what the problem could be, please give me a tip.
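    For reference, a 400 on the naked domain usually means the apex record is missing or the naked host was never registered as an alias with the hosting platform. A minimal BIND-style zone sketch is below, assuming a placeholder endpoint IP of 203.0.113.10 and a provider CNAME target of lb.provider.example (both are illustrative values, not the actual cloudControl ones):

        ; apex (naked) domain: an A record, since a CNAME is not allowed at the zone apex
        juve-news.com.        IN  A      203.0.113.10
        ; www host: can point at the provider's load balancer by name
        www.juve-news.com.    IN  CNAME  lb.provider.example.

    Whatever values the host documents, both juve-news.com and www.juve-news.com normally have to be added as aliases in the hosting control panel as well; platforms that route by Host header often answer requests for an unregistered name with a 400 Bad Request.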

    Read the article

  • How to fix "[Errno 13] Permission denied" in Mailman mailing lists

    - by Michael
    After migrating domains from one Plesk server to another, I get several of these mails every day (the target mailbox does not exist, so I receive them as undeliverable mail bounces):

        Return-Path: <[email protected]>
        Received: (qmail 26460 invoked by uid 38); 26 May 2012 12:00:02 +0200
        Date: 26 May 2012 12:00:02 +0200
        Message-ID: <20120526100002.xyzxx.qmail@lvpsxxx-xx-xx-xx.dedicated.hosteurope.de>
        From: [email protected] (Cron Daemon)
        To: [email protected]
        Subject: Cron <list@lvpsxxx-xx-xx-xx> [ -x /usr/lib/mailman/cron/senddigests ] && /usr/lib/mailman/cron/senddigests
        Content-Type: text/plain; charset=ANSI_X3.4-1968
        X-Cron-Env: <SHELL=/bin/sh>
        X-Cron-Env: <HOME=/var/list>
        X-Cron-Env: <PATH=/usr/bin:/bin>
        X-Cron-Env: <LOGNAME=list>

        List: xyzxyz: problem processing /var/lib/mailman/lists/xyzxyz/digest.mbox:
        [Errno 13] Permission denied: '/var/lib/mailman/archives/private/xyzxyz'

    I tried to fix the permissions myself, but the problem still exists.
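    The error means the user the cron job runs as (list, per the X-Cron-Env headers) cannot write to the private archive directory, so the ownership or group settings under /var/lib/mailman did not survive the migration. A hedged sketch of the usual repair on a Debian/Plesk-style layout (the group name and paths can differ per distribution):

        # restore group ownership of the migrated archive tree (assumption: Mailman's group is "list")
        chown -R root:list /var/lib/mailman/archives/private
        # let Mailman's own checker report and fix the remaining permission problems
        /usr/lib/mailman/bin/check_perms -f

    check_perms ships with Mailman and, run with -f as root, fixes most permission problems it finds; run it repeatedly until it reports no problems.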

    Read the article

  • Tracking Unique site Views for 2012 - Not my website

    - by user580950
    I am in trouble. In 2012 I placed an advert on a website that claimed to have 950,000 unique visits each month, so early in 2012 I advertised with them. The advertising didn't work out, so I checked again after 2-3 months and saw that the unique visitor count on their site was only about 8,000 at that time. I immediately closed the account. I don't remember which site I used to check their unique visitors. That advertising company has now filed a dispute against me. So, is there any tool that can give me 2012 traffic statistics for any website? I tried Google Trends but it doesn't show such statistics.

    Read the article

  • Meta description not displaying in custom site search results page

    - by Stephen Connolly
    We have Google Custom Site Search implemented on our company website. When I look at the results page, I notice that the meta description is not being displayed. It seems to be reading the link titles from our drop-down menu and using those as the description. When I search for the same page via google.com, the meta description is pulled in correctly. Any thoughts on why this might be happening? I can't see anything relevant in the Custom Site Search settings.

    Read the article

  • Strange Google Analytics result when new site launched

    - by Howard
    I have a website which originally contained only a few pages, and we have now revamped it into a new site containing several hundred pages. We are seeing a strange Google Analytics result, as follows:

    Before: Traffic sources (all traffic): 674; Content (all pages, unique PV): 674
    After: Traffic sources (all traffic): 291; Content (all pages, unique PV): 1235

    As you can see, the unique pageviews have increased as expected (we have more pages now and the site is better), but why are the traffic sources lower, and why is there such a large gap? Any ideas?

    Read the article

  • SQL SERVER – Weekly Series – Memory Lane – #007

    - by pinaldave
    Here is a list of selected articles from SQLAuthority.com across the years. Instead of listing every article, I have picked a few of my favorites and listed them here with additional notes. Let me know which of the following is your favorite article from memory lane.

    2006

    Find Stored Procedure Related to Table in Database – Search in All Stored Procedure
    In 2006 I wrote a small script which helps users find all the Stored Procedures (SPs) related to one or more specific tables. This was quite a popular script; however, in SQL Server 2012 the same can be achieved using the new DMV sys.sql_expression_dependencies. I recently blogged about it in Find Referenced or Referencing Object in SQL Server using sys.sql_expression_dependencies.

    2007

    SQL SERVER – Versions, CodeNames, Year of Release
    1993 – SQL Server 4.21 for Windows NT
    1995 – SQL Server 6.0, codenamed SQL95
    1996 – SQL Server 6.5, codenamed Hydra
    1999 – SQL Server 7.0, codenamed Sphinx
    1999 – SQL Server 7.0 OLAP, codenamed Plato
    2000 – SQL Server 2000 32-bit, codenamed Shiloh (version 8.0)
    2003 – SQL Server 2000 64-bit, codenamed Liberty
    2005 – SQL Server 2005, codenamed Yukon (version 9.0)
    2008 – SQL Server 2008, codenamed Katmai (version 10.0)
    2012 – SQL Server 2012, codenamed Denali (version 11.0)

    Search String in Stored Procedure
    Searching for a string in stored procedures is one of the most frequent tasks developers do. They might be searching for a table, a view or any other detail. I have written a script to do this in SQL Server 2000 and SQL Server 2005; this is a blog post worth bookmarking. There is an alternative way to do the same as well, and an example is given there.

    2008

    SQL SERVER – Refresh Database Using T-SQL
    NO! Some questions have a single answer: NO! You may want to read the question in the original blog post. I had a great time saying no!

    SQL SERVER – Delete Backup History – Cleanup Backup History
    SQL Server stores the history of every backup ever taken; this history lives in the msdb database, and the older entries are often no longer required. The following stored procedure can be executed with a parameter specifying how many days of history to keep. In the example, 30 is passed to keep a month of history.

    2009

    Stored Procedure are Compiled on First Run – SP taking Longer to Run First Time
    Are stored procedures pre-compiled? Why does a stored procedure take longer to run the first time? This is a very common question among developers and DBAs. There is an absolutely definite answer, but the question has been discussed forever. There is a misconception that stored procedures are pre-compiled: they are not; they are compiled only during the first run, and for every subsequent run the compiled plan is reused. Read the entire article for an example and demonstration.

    Removing Key Lookup – Seek Predicate – Predicate – An Interesting Observation Related to Datatypes
    This is one of the most important performance tuning lessons on my blog. I suggest you spend time this weekend reading it, and let me know what you think about the concepts demonstrated in the four-part series. Part 1 | Part 2 | Part 3 | Part 4
    The Seek Predicate is the operation that describes the b-tree portion of the seek; the Predicate is the operation that describes the additional filter using non-key columns. Based on this description it is clear that a Seek Predicate is better than a Predicate, as it searches the index, whereas with a Predicate the search is on non-key columns, which means the search reads the data pages themselves.

    Policy Based Management – Create, Evaluate and Fix Policies
    This article covers one of the most spectacular features of SQL Server, policy-based management, and how configuring SQL Server with a policy-based management architecture can make a powerful difference. Policy-based management is loaded with advantages: it can help you implement various policies for reliable configuration of the system, and it provides additional administration assistance to DBAs, helping them effortlessly manage various SQL Server tasks across the enterprise.

    2010

    Recycle Error Log – Create New Log file without Server Restart
    I once observed a DBA restarting SQL Server whenever he needed a new error log file. This was funny and sad at the same time. There is no need to restart the server to create a new log file or recycle the log file: you can run sp_cycle_errorlog and achieve the same result.

    Get Database Backup History for a Single Database
    A simple but effective script!

    Reducing CXPACKET Wait Stats for High Transactional Database
    The subject is very complex and I have done my best to simplify the concept. In simple words, when a parallel plan is created for a SQL query there are multiple threads for a single query, and each thread deals with a different set of the data (or rows). For various reasons, one or more of the threads lag behind, creating the CXPACKET wait stat: the threads which finish first have to wait for the slower thread to finish, and the wait recorded by a completed thread is the CXPACKET wait stat.

    Information Related to DATETIME and DATETIME2
    There is quite a lot of confusion around DATETIME and DATETIME2; DATETIME2 is also one of the more underutilized datatypes in SQL Server. In this blog post I wrote a follow-up to my earlier datetime series, clarifying a few datetime-related concepts: Difference Between GETDATE and SYSDATETIME; Difference Between DATETIME and DATETIME2 – WITH GETDATE; Difference Between DATETIME and DATETIME2.

    2011

    Introduction to CUME_DIST – Analytic Functions Introduced in SQL Server 2012
    SQL Server 2012 introduces the new analytic function CUME_DIST(), which provides the cumulative distribution value. It is difficult to explain in words, so I attempt a small example to explain the function. Instead of creating a new table, I use the AdventureWorks sample database, as most developers use it for experiments.

    Introduction to FIRST_VALUE and LAST_VALUE – Analytic Functions Introduced in SQL Server 2012
    SQL Server 2012 introduces the new analytic functions FIRST_VALUE() and LAST_VALUE(), which return the first and last value from the list. It is difficult to explain in words, so I attempt to explain the functions through a brief example. Instead of creating a new table, I use the AdventureWorks sample database, as most developers use it for experiments.

    OVER clause with FIRST_VALUE and LAST_VALUE – Analytic Functions Introduced in SQL Server 2012 – ROWS BETWEEN UNBOUNDED PRECEDING AND UNBOUNDED FOLLOWING
    "Don't you think there is a bug in your first example, where FIRST_VALUE stays the same but LAST_VALUE changes on every line? I think LAST_VALUE should be the highest value in the window or set of results."

    Puzzle – Functions FIRST_VALUE and LAST_VALUE with OVER clause and ORDER BY
    You can see that rows 2, 3, 4 and 5 have the same SalesOrderID = 43667. The FIRST_VALUE is 78 and the LAST_VALUE is 77. If these functions worked on the maximum and minimum values, they should have given 77 and 80 respectively instead of 78 and 77. Also, the FIRST_VALUE (78) is greater than the LAST_VALUE (77). Why? Explain in detail.

    Introduction to LEAD and LAG – Analytic Functions Introduced in SQL Server 2012
    SQL Server 2012 introduces the new analytic functions LEAD() and LAG(). These functions access data from a subsequent row (LEAD) and the previous row (LAG) in the same result set without the use of a self-join. It is difficult to explain in words, so I attempt a small example to explain the functions. Instead of creating a new table, I use the AdventureWorks sample database, as most developers use it for experiments.

    A Real Story of Book Getting 'Out of Stock' to A 25% Discount Story Available
    Our book went out of stock within 48 hours of arriving in stock! We got a call from the online store requesting more copies within 12 hours, but we had printed only as many as we had sent them; there were no extra copies. We finally talked to the printer to get more copies printed; however, due to festivals and holidays the copies could not be shipped to the online retailer for two days. We knew for sure that they would be out of the book for 48 hours. This is the story of how we overcame that situation!

    Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Memory Lane, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology
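    As a quick reference for the two administrative items above (cleaning up msdb backup history and recycling the error log), here is a minimal T-SQL sketch; the 30-day cutoff is only an illustrative value, and both procedures ship with SQL Server:

        -- keep roughly the last 30 days of backup history in msdb (illustrative retention)
        DECLARE @oldest_date DATETIME = DATEADD(DAY, -30, GETDATE());
        EXEC msdb.dbo.sp_delete_backuphistory @oldest_date = @oldest_date;

        -- start a new SQL Server error log without restarting the service
        EXEC sp_cycle_errorlog;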

    Read the article

  • Browser window size statistics?

    - by Litso
    Hey all, I was wondering: are there any statistics available on what size users have their browser window set to nowadays? I know the screen resolutions (we have analytics, which shows those as well), but I doubt many people with 1280*xxx and higher resolutions still browse full-screen. My boss is determined to keep our website 900px wide, because that way people with 1800*xxx resolutions can have two browser windows next to each other without having to scroll horizontally. I have never seen anyone browse with two adjacent browser windows like that except here at my current job, so I'm doubting whether this is the best decision or just his personal preference. Can anyone help out here?

    Read the article

  • Google Custom Search gives different result counts for the same query

    - by santiagozky
    We are using Google Custom Search and we have found that the totalResults value often alternates between two values, even for the same query. The two values can be close together or can differ by more than double. The parameters I am using look like this: https://www.googleapis.com/customsearch/v1? q=something cx=XXXXXXXXXX lr=lang_en siteSearch=www.mydomain.com start=1 fields=context%2Citems%28fileFormat%2CformattedUrl%2Clink%2Cpagemap%2Csnippet%2Ctitle%29%2Cqueries%2CsearchInformation%28searchTime%2CtotalResults%29%2Cspelling%2FcorrectedQuery key=YYYYYYYYYYYYYYY filter=0 This is a problem because I use totalResults to calculate the number of result pages. How can I get the same result count for the same query?

    Read the article

  • Why do some bad websites rank well?

    - by BradB
    Consider the following scenario: you are pitching SEO/website optimisation to a prospective client. You explain the importance of great copy and content, how acquiring links (ethically) can increase page rank, why the quality of the HTML build matters (H1, H2 tags, W3C validation etc.), and why keyword research is beneficial; you may drop in a few Google Webmaster Guidelines or Matt Cutts references to back up your claims, and for good measure you rubbish the "black hat" approach as no longer effective. Your advice is ethical and, in the eyes of best practices, spot on. Then the client points you to some of their long-established competitors on Google, and you see these competitor websites ranking in the top spots (1 to 3) for medium to highly competitive search phrases that your client wants to compete for. These websites totally contradict your ethical approach and pretty much violate every best practice noted above. They even outperform other "white hat" competitors who follow the above guidelines. I experienced this today. One of these well-ranking websites had:

    - About six microsites with more or less the same copy and a slightly varied layout
    - Little or no textual content; I would almost say duplicate content across the sites, but there was so little of it that it could barely qualify as duplicate
    - All the content in Flash (with a music track that kicked in on each page load; not so much an SEO issue, but it helps paint the picture)
    - Keyword stuffing behind the Flash file, with a bunch of black text on a black background in the style of "keyword 1 keyword 2,keyword1,keyword 2,keyword 2 keyword 3" and so on, with the exact same keyword-stuffed combination present on every page of the website
    - A bunch of clearly self-made links from poor-quality forums and directories with little or no PageRank
    - Links exchanged across the microsites

    How do you explain your way out of this when this hard evidence is sat in front of you, undermining your great pitch?

    Read the article

  • Domains with similar names and legal issues

    - by abel
    I recently purchased one of those domain names like del.icio.us. While registering, I found that delicious.com was already in use. Argument: I found that delicious.com belonged to the same category as my to-be website; it served premium delicious dishes. Counter-argument: my to-be domain, though belonging to the same category, specialises in serving free but delicious dishes, or in giving out (affiliate) links to other sites serving premium delicious dishes. Additional counter-arguments: 1. delicious.com was not in English. 2. The del.icio.us in my domain name, though having the same spelling, is not going to be used in the same fashion. For example (this may not make sense, because the names have been changed), the d in delicious on my website actually stands for the Greek letter Delta (Δ/d), and since internationalised domains are still not easily typeable, I am going for the English equivalent. The prefix holds importance for the theme of the service which my website intends to offer. My question: can I use the domain name del.icio.us for my website? How are these kinds of matters dealt with? (The domain names used are fictitious. I have already registered the domain but have not started using it. I chanced upon this domain name because it was short, easy to remember, and suited the theme of my website.)

    Read the article

  • MySQL vs. SQL Server on GoDaddy - What is the difference between a hosted DB and an App_Data DB?

    - by Nate Gates
    I'm using GoDaddy for site hosting, and I'm currently using MySQL because there are fewer limits on size, etc. My question is: what is the difference between using a hosted GoDaddy database such as MySQL vs. creating a SQL Server database in the App_Data folder? My guess is security. Would it be a bad idea to use a SQL Server DB that's located in the App_Data folder? Additionally: I am able to create an .mdf (SQL Server DB file) in the App_Data folder, but I'm really unsure whether I should use it or not. If I did use it, it would simplify using some of the Microsoft tools. Like I said, my guess is that it would be less secure, but I don't really know. I do know I have a 10 GB file-system limit, so I'm assuming my DB would have to share that space.
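    For context, a file-based SQL Server database in App_Data is normally attached through a connection string like the sketch below (the file and connection-string names are placeholders). ASP.NET refuses direct HTTP requests for anything under App_Data, so the usual trade-offs are the shared disk quota and SQL Server Express limitations rather than the file being downloadable; whether a shared host supports user instances at all varies, so check the host's documentation.

        <connectionStrings>
          <!-- hypothetical example: attach an .mdf stored in App_Data via SQL Server Express -->
          <add name="SiteDb"
               connectionString="Data Source=.\SQLEXPRESS;AttachDbFilename=|DataDirectory|\Site.mdf;Integrated Security=True;User Instance=True"
               providerName="System.Data.SqlClient" />
        </connectionStrings>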

    Read the article

  • How do you promote your blog or website?

    - by zcourts
    I tend to get what I think are good ideas, and I go out and either build the software/website from scratch or use existing software/tools such as WordPress. But when I'm done, even though a few users say they really like it, I can't seem to get my apps out there, or rather get a large number of eyes on them. So I'm interested in knowing how others do it. I read people's stories of how they built this amazing thing and within 2-3 months they're getting thousands or hundreds of thousands of users per month; it just seems to be all smoke and mirrors. So how have you done it, or how has anyone you know done it? Does everyone throw lots of money into promotion, or is it something else?

    Read the article

  • Symbolic link not allowed or link target not accessible: /var/www on Ubuntu 11.04

    - by Jamie Hutber
    I am getting a 403 when I access http://mayfieldafc.local/ and on looking in the Apache logs I see:

        [Wed Nov 16 12:32:59 2011] [error] [client 127.0.0.1] Symbolic link not allowed or link target not accessible: /var/www

    I have what I believe to be the correct permissions set on /var/www: hutber (my user) can create and delete files, and can also execute as program on this folder. In Mayfield's vhost it is:

        <Directory /var/www/mayfieldafc/docroot>
            Options +FollowSymLinks
            AllowOverride None
            Order allow,deny
            Allow from all
        </Directory>

    I am pulling my hair out not being able to work on my sites on my work Ubuntu install, and I know of nothing else that could be affecting this. So, any ideas?
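    One hedged guess based on the error: the message names /var/www itself, so the symlink Apache is refusing sits on that path, and FollowSymLinks (or a matching owner, if SymLinksIfOwnerMatch is in effect) has to be allowed for every directory Apache traverses, not only the final docroot. A sketch of the kind of block that usually clears it, assuming the default Apache 2.2 layout on Ubuntu 11.04:

        # allow symlinks at the level where the link actually lives, not just in the docroot
        <Directory /var/www>
            Options +FollowSymLinks
            AllowOverride None
            Order allow,deny
            Allow from all
        </Directory>

    If the main config uses SymLinksIfOwnerMatch instead, the link and its target must be owned by the same user (Apache reports the same error quoted above otherwise), and the target directory itself must be readable and executable by the Apache user (www-data).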

    Read the article

  • How to set up an SPF record?

    - by MeltingDog
    The clients on my VPS are all getting spammed. The spam seemingly comes from their own email addresses, so it is clear that somehow something got into my VPS, captured all the email addresses that exist on it, and is now using them to send spam. I was advised to set up an SPF record, but I am unsure what this is or how to go about it. After some reading I have figured out how to create one in cPanel, but I cannot find what to do with it next. Do I copy it into somewhere in my DNS records in Zone Management? Can anyone point me in the right direction?
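    For reference, an SPF record is simply a TXT record in the domain's DNS zone (so yes, it goes into Zone Management) that lists which servers are allowed to send mail for that domain; receiving mail servers can then reject messages that claim to be from the domain but come from elsewhere. A minimal sketch, with a placeholder IP standing in for the VPS's real address:

        ; hypothetical zone entry: only the domain's A host, its MX hosts and 203.0.113.10 may send its mail
        example.com.    IN  TXT    "v=spf1 a mx ip4:203.0.113.10 -all"

    Note that SPF only helps other servers reject forged mail that claims to come from your clients' addresses; it does not stop spam being sent if the VPS itself has been compromised, so the machine still needs to be cleaned up.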

    Read the article

  • What is the best way to construct a "remove multiple items" area (ASP.NET VB) [on hold]

    - by Darkcat Studios
    Let's say, for example, I have a (variable-length) two-dimensional array of product names and their unique product codes. I can display this list in a datagrid, table, etc. (imagine this as a standard shopping-basket type scenario). What I need is to be able to tick multiple items and then, on clicking a submit button, fire an action. The bits I'm struggling with are: A: how do I programmatically display asp:CheckBoxes for each item (and give them a unique ID)? B: how do I know which are ticked when firing the final action? (Not sure if this question is best suited to the main Stack, but there's so much activity on there that questions just get lost now!)
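    A hedged sketch of one common approach in ASP.NET/VB: bind the products to an asp:CheckBoxList so each generated checkbox gets a unique ID automatically and carries the product code in its Value, then loop over the Items collection on postback to see which were ticked. GetBasketItems and RemoveFromBasket below are hypothetical helpers standing in for your own data access.

        <%-- markup: one checkbox per product; the product code travels in each item's Value --%>
        <asp:CheckBoxList ID="ProductList" runat="server" DataTextField="Name" DataValueField="Code" />
        <asp:Button ID="RemoveButton" runat="server" Text="Remove selected" OnClick="RemoveButton_Click" />

        ' code-behind (VB.NET)
        Protected Sub Page_Load(sender As Object, e As EventArgs) Handles Me.Load
            If Not IsPostBack Then
                ProductList.DataSource = GetBasketItems()   ' hypothetical helper returning Name/Code pairs
                ProductList.DataBind()
            End If
        End Sub

        Protected Sub RemoveButton_Click(sender As Object, e As EventArgs)
            For Each item As ListItem In ProductList.Items
                If item.Selected Then
                    RemoveFromBasket(item.Value)            ' hypothetical helper keyed by product code
                End If
            Next
        End Sub

    If each row needs more than a label and a checkbox, the same pattern works with a GridView or Repeater: put a CheckBox in the item template and read it back with FindControl on postback.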

    Read the article

  • How Easy is it to Code In-Built Videos?

    - by Alan Parker
    First-time poster, so please don't bite my head off. Basically, I'm having a site built for me. I don't really know anything about coding, and I'm not too sure that I trust my web developer. I recently asked him about adding a feature where I could display built-in videos like the following page does - http://www.ejot.co.uk/buildingfasteners.odl - and he quoted me quite a high amount for it. I just wanted to double-check with you guys whether this is a difficult feature to add, and whether it justifies a reasonable amount of money on top of what I'm already paying him. Thanks in advance for your help, Alan
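    For a rough sense of scale: embedding a self-hosted video in a page is, by itself, a small job nowadays; the sketch below is the standard HTML5 markup (the file names are placeholders). What tends to cost real money is encoding the videos, building a player or gallery around them, and wiring it into the site's CMS, so the quote depends on how much of that is included.

        <!-- hypothetical example: a self-hosted video with a poster image and a fallback message -->
        <video width="640" height="360" controls poster="intro-poster.jpg">
          <source src="intro.mp4" type="video/mp4">
          Your browser does not support HTML5 video.
        </video>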

    Read the article

  • Getting away from a customized Magento 1.4 installation - Magento 1.6, OpenCart, or others?

    - by Phil
    I'm dealing with a Magento 1.4.0.0 Community Edition installation with various undocumented changes to the core (mostly integration with an ERP system), an outdated Sweet Tooth Points & Rewards module and some custom payment providers. It also doubles as a mediocre blogging/CMS system. It has one store each for 3 different languages, with about 40 product categories for a few hundred products. [rant] With no prior experience with any PHP e-commerce system, I find it very difficult to work with. I attempted to install Magento 1.4.0.0 on my local WAMP dev machine; it installs fine, but the main page and search do not show any products no matter what I do in the backend admin panel. I don't know what's wrong with it, and whatever information I have googled is either too old or too new for Magento 1.4. Later I was given FTP access to the testing server, which neither my manager nor I have permission to install XDebug on, as apparently it runs on the same machine as the production server (yikes). Trying to learn how Magento works is torture. I spent a week trying to add some fields to the Onepage Checkout before giving up and going to work on something else. The template system, just like the rest of Magento, is a bloated mishmash of overcomplicated directory structures, weird config XML files and EAV databases. I went into 6 different models and several content blocks in the backend just to change what the front page looks like. With little helpful and clear documentation (unlike CodeIgniter) and various breaking changes between minor point revisions, which makes it hard to find useful information, Magento 1.4 is a developer killer. [/rant] The client is planning to redesign the site and has decided it might as well move on from this unsustainable, hacky, upgrade-unfriendly, developer-unfriendly mess. Magento 1.4 is starting to show its age; with Magento 1.7 coming soon, the client is considering upgrading to Magento 1.6 or 1.7 if it has improved since 1.4. The customizations done to the current Magento 1.4 installation will have to be redone, and a new license for the Sweet Tooth Points & Rewards module will have to be bought. The client is also open to other e-commerce systems. I've looked at OpenCart and it seems to be quite developer-friendly with a fairly simple structure. I found some complaints regarding its performance when a shop has thousands of categories or products, but this is not an issue with the number of products my client currently has. It seems to be solid ground for easily porting over the rewards system and ERP integration. What should the client upgrade to in this case?

    Read the article

  • Webmaster Tools, www and no-www, duplicate content and subdomains

    - by Jay
    I have not come to any conclusive answer, after many hours of research on many websites, on the specific issue I am trying to figure out. My company has two websites: a main one at www.example.com and one at subdomain.example.com, which is a subdomain of the first and is our self-hosted blog. The way Google sees these with the www or no-www (called "naked" from now on) is that each of them is actually a different site depending on whether the www is used in front of the domain. I completely understand this. It is also advised that both should be set up in Google Webmaster Tools, which I have done; correct me if I am wrong about having both set up. Now, it appears that we can only set a preferred domain in Webmaster Tools at the root-domain level. The subdomain cannot have this and actually shows the message "Restricted to root level domains only". So it appears that the subdomain should follow what the root domain says, which for our preferred version is to display www.example.com and not the naked version. That is one issue I have: one displays one way and the other displays another. Is it that we have the wrong redirects in place for the subdomain? Another question is: does this have any effect on SEO in regard to duplicate content on the web, given how we have set this up?
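    A hedged sketch of the usual server-side fix for the www/naked split: 301-redirect the naked host to the preferred www host so only one version is ever crawled and indexed (shown here for Apache with mod_rewrite, using the example domain from the question). The blog subdomain is normally left alone, since content on subdomain.example.com is not a duplicate of the main site just by being on a subdomain.

        # hypothetical .htaccess for the main site: send example.com permanently to www.example.com
        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
        RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]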

    Read the article

  • Why do some websites change their short, user-friendly URLs to long URLs?

    - by diEcho
    Hello all, I wonder why some websites change their short, user-friendly URLs to long URLs. For example: cricinfo.com became espncricinfo.com, indiafm.com became bollywoodhungama.com, and I have seen many others. I just want to know what the exact need for doing this is. Is there an economic reason, or something else? I think users don't like to type long website names; I still type indiafm.com and the browser automatically redirects the URL. (Sorry if the tags are wrong.) Thanks,

    Read the article

  • How to Fix this specific Google "Fetch as Googlebot" error appearing on my Webmaster Tools?

    - by UXdesigner
    Good day. I'm currently trying to find out why my website has lost all of its rank in Google: I don't even appear in Google results for my own domain, although other sites that link to me do appear in the results. I think it's down to leaving my site alone for two months and coming back to find 20k comment-spam entries, which I completely deleted and fixed with filters and a new Disqus comment service. The thing is, I added my site to Google Webmaster Tools and I'm finding several awful things. For example, when I click Fetch as Googlebot I receive the error message below in response to my request, and I don't know what the real problem is or how to fix it. I simply don't get it. This is what appears:

        Date: Wednesday, July 20, 2011 9:43:35 AM PDT
        Googlebot Type: Web
        Download Time (in milliseconds): 55

        HTTP/1.1 403 Forbidden
        Date: Wed, 20 Jul 2011 16:43:36 GMT
        Server: Apache
        Vary: Accept-Encoding
        Content-Encoding: gzip
        Content-Length: 248
        Keep-Alive: timeout=2, max=100
        Connection: Keep-Alive
        Content-Type: text/html; charset=iso-8859-1

        403 Forbidden
        Forbidden
        You don't have permission to access / on this server.
        Additionally, a 403 Forbidden error was encountered while trying to use an ErrorDocument to handle the request.

    Do you guys know anything about this problem? I need to have Google crawl my site again. I used to have really good Google results for the past three years; now there's nothing. Thanks,
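    One hedged guess at a common cause: a 403 that only the crawler sees is often a leftover anti-spam rule in .htaccess or the vhost that matches Googlebot's user agent (or blocks a whole IP range). The sketch below shows what such a rule can look like in Apache 2.2 syntax and what a neutral allow-all block looks like; the actual rules on the server, if any, will differ, so the first step is simply to search the configs for "Googlebot", "Deny" and "RewriteCond %{HTTP_USER_AGENT}".

        # a leftover block like this answers Googlebot (and only Googlebot) with 403 Forbidden
        SetEnvIfNoCase User-Agent "Googlebot" blocked_bot
        Order Allow,Deny
        Allow from all
        Deny from env=blocked_bot

        # a neutral replacement that lets every client, including the crawler, back in
        Order Allow,Deny
        Allow from all

    The second 403 in the message (the ErrorDocument line) usually just means the custom error page is caught by the same deny rule, so fixing the block fixes both.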

    Read the article

  • Different robots.txt for two different domains point to same folder

    - by Ali
    Hi, I have the following two domains: domain.com and test.domain.com. Both point to the same folder, which is public_html. What I want is a different robots.txt file for each domain, so that when someone browses domain.com/robots.txt one file is shown, and when someone goes to test.domain.com/robots.txt a different file is shown. How can I do this using URL rewriting in .htaccess? Thanks
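    A hedged mod_rewrite sketch for this: keep the normal robots.txt for the main domain, add a second file (robots_test.txt is just an illustrative name) in public_html for the subdomain, and rewrite requests for robots.txt on the test host to it.

        # hypothetical .htaccess in public_html
        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^test\.domain\.com$ [NC]
        RewriteRule ^robots\.txt$ robots_test.txt [L]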

    Read the article

  • Stop Apache serving filetypes

    - by ProfSmiles
    Preferably using .htaccess files, though .conf files are an option: is there any way to stop Apache serving certain file types? For example, .db files shouldn't be served for obvious reasons (privacy and whatnot), so could I make them show up as a 404 but still have them available to my CGI scripts? Putting these sensitive files in a directory other than /public_HTML/ is also an option, though I like having them in the same directory as the scripts for ease of use. Cheers
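    A hedged .htaccess sketch (Apache 2.2 era syntax, matching the question): denying the files over HTTP does not affect CGI scripts, which open them through the filesystem rather than through a web request. A plain deny answers with 403; if a 404 is genuinely preferred, mod_rewrite can send that status instead.

        # refuse any direct request for .db files (client sees 403 Forbidden)
        <FilesMatch "\.db$">
            Order allow,deny
            Deny from all
        </FilesMatch>

        # alternative via mod_rewrite if a 404 response is preferred over a 403
        RewriteEngine On
        RewriteRule \.db$ - [R=404,L]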

    Read the article

  • Facebook page design is not working in IE8 [closed]

    - by PrateekSaluja
    Hello experts, we have designed a Facebook page. It works fine in all browsers including IE7, but it does not work in IE8. We checked, and found that if we run our code outside the Facebook page it works in IE8, but when we put our code into the Facebook page it does not work. Here is the CSS code we are using for IE8:

        <!--[if lt IE 8]>
        <style>
        .nv_a { width:90px; height:27px; float:left; text-align:center; padding-top:8px; }
        .nvt_a { width:66px; height:27px; float:left; text-align:center; padding-top:8px; }
        .nv_a a { width:90px; height:27px; float:left; padding-top:8px; text-align:center; color:#000; display:inline-block; text-decoration:none; background-color:#e0e0e0; border-top:solid 1px #999; border-left:solid 1px #999; border-right:solid 1px #999; border-bottom:solid 1px #999; }
        .nv_a a:hover { width:90px; height:27px; padding-top:8px; float:left; color:#000; text-align:center; background-color:#ccc; }
        .nvt_a a { width:66px; height:27px; float:left; padding-top:8px; text-align:center; color:#000; display:inline-block; text-decoration:none; background-color:#e0e0e0; border-top:solid 1px #999; border-left:solid 1px #999; border-right:solid 1px #999; border-bottom:solid 1px #999; border:1px solid red; }

    Please help us to solve the issue.

    Read the article

  • Will adding top level directories with similar structure to existing directories change the SEO of my site?

    - by Russell Sims
    I've been pointed this way for SEO-related questions, and this one has had me pondering for a little while now. I'm recreating a site's structure. The website's content is generated from several feeds, and unless I want to place each and every one of the 10,000-odd venues into its own category manually, I can't avoid categorising each item by its address. The current structure looks like this: Homepage > region > county > city/town > venue page, and the URL looks like domain/region/county/city/venue/. I'm relatively happy to use this structure as it's not too convoluted. However, we also promote deals, and we also group the venues into their respective franchises, which leads to URLs such as domain/groups and domain/deals. My question is: how should the directory structure look with these new additions? Would I have a URL that looks like domain/deals/region/county/city/venue or domain/group/region/county/city/venue, and just put a 301 or a canonical link tag on the page to prevent the duplicate pages competing with each other? Or am I just worrying about it needlessly, and should I perhaps link straight from domain/deals to the venue-page URL domain/region/county/city/venue? That bothers me a bit though, as the deals and groups will not then be in the breadcrumbs.
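    If the deal and group paths do end up serving the same venue content, the usual approach is a canonical tag on each duplicate URL pointing at the main venue URL, so the deep paths can keep their breadcrumbs without splitting ranking signals. A minimal sketch using the hierarchy from the question (the path itself is illustrative):

        <!-- placed in the <head> of domain/deals/region/county/city/venue -->
        <link rel="canonical" href="http://domain/region/county/city/venue/" />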

    Read the article
