Search Results

Search found 9717 results on 389 pages for 'pro'.


  • Transition to new site

    - by James Hill
    I'm almost finished rewriting the website for a non-profit organization. The existing site receives roughly 5,000 visits a month. The new site is being written in ASP.NET, while the existing site is PHP. The current hosting provider does not support .NET hosting, so I'll be switching providers. My question revolves around the transition from the old site to the new one. I would really like to get the new site up at the new hosting provider and do thorough testing before changing the DNS records for the domain. Question: how can I put the new site up, test it, and make any changes/additions necessary before updating the domain DNS to point to the new IP, without Google indexing the content? Also, what SEO repercussions should I be aware of when making such a drastic change to the content that exists under the domain name?
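
    The way I picture the testing step is a sketch like the following (the IP address and domain below are placeholders): point my own machine's hosts file at the new server, so only I see the new site under the real hostname, and serve a disallow-all robots.txt from the staging copy until go-live.

        # hosts file entry (/etc/hosts, or C:\Windows\System32\drivers\etc\hosts on Windows)
        # 203.0.113.10 stands in for the new server's IP
        203.0.113.10    www.example.org

        # temporary robots.txt served by the staging copy -- must be removed at go-live
        User-agent: *
        Disallow: /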

    Read the article

  • Desktop Software to monitor online status of web site and web-based application

    - by pansp
    I'm basically looking for desktop-based software that can monitor my company's website and web application's online availability. I know there are a few online services like Uptime Robot that do the same job, but I have been asked to find a desktop-based tool that runs in the system tray and gives notifications of any downtime. A free tool would be great. Any help would be appreciated. Thanks!

    Read the article

  • What alternatives to Animated GIFs are natively supported on all popular modern browsers on tablets, etc. without plugins?

    - by Clay Nichols
    I was looking for an alternative to animated GIFs for animated images. But per caniuse.com, support for APNG seems to be being phased out, and MNG support isn't even listed there; the pages about it don't even mention Chrome (suggesting those pages are very, very old). Clarification: this is for a web app, so it'll need to support:
    - Safari on iPad (so it can't depend on extensions)
    - Chrome on Windows and Mac
    - Safari 6.0+ on Mac
    - Chrome on Android
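
    One option that is natively supported on all four targets (with vendor prefixes) is a CSS sprite-sheet animation; a minimal sketch, where frames.png and the frame count are made up:

        /* frames.png is assumed to hold 10 frames of 100x100px laid out horizontally */
        .anim {
            width: 100px; height: 100px;
            background: url(frames.png) 0 0 no-repeat;
            -webkit-animation: play 1s steps(10) infinite; /* prefixed for the Safari/Chrome versions above */
            animation: play 1s steps(10) infinite;
        }
        @-webkit-keyframes play {
            from { background-position: 0 0; }
            to   { background-position: -1000px 0; }
        }
        @keyframes play {
            from { background-position: 0 0; }
            to   { background-position: -1000px 0; }
        }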

    Read the article

  • How do I get categories for a product in Magento [closed]

    - by Joost de Valk
    I'm trying to add product_type to my Magento Google Base output based on the product's categories, but I seem to be unable to. I have the following code:

        // Get categories from product to include as product_type
        $categoryIds = $object->getCategoryIds();
        foreach ($categoryIds as $categoryId) {
            $category = Mage::getModel('catalog/category')->load($categoryId);
            $this->_setAttribute('product_type', $category->getName(), 'text');
        }

    The issue is that it returns all of the categories, not just the ones the product is in. Anyone have a solution?
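
    For comparison, a sketch of the same loop using the product's category collection (Magento 1.x; $object is assumed to be the product model, as above), which also avoids one load() per category:

        // load only this product's categories, selecting just the name attribute
        $categories = $object->getCategoryCollection()->addAttributeToSelect('name');
        foreach ($categories as $category) {
            $this->_setAttribute('product_type', $category->getName(), 'text');
        }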

    Read the article

  • Web Safe Area (optimal resolution) for web app design?

    - by M.A.X
    I'm in the process of designing a new web app and I'm wondering what 'Web Safe Area' I should optimize the app layout and design for. By Web Safe Area I mean the actual area available to display the website in the browser (which is influenced by monitor resolution as well as the space taken up by the browser and OS). I did some investigation and thinking on my own, but wanted to share this to see what the general opinion is. Here is what I found.

    Optimal Display Resolution:
    - w3schools' web stats seem to be the most referenced source (however, they state that these are results from their own site and are biased towards tech-savvy users)
    - http://www.w3counter.com/globalstats.php (aggregate data from something like 15,000 different sites that use their tracking services)
    - StatCounter Global Stats Display Resolution (stats are based on aggregate data collected by StatCounter on a sample exceeding 15 billion pageviews per month, collected from across the StatCounter network of more than 3 million websites)
    - NetMarketShare Screen Resolutions (marketshare.hitslink.com) (a web analytics consulting firm; they get data from browsers of site visitors to their on-demand network of live-stats customers. The data is compiled from approximately 160 million visitors per month)

    Display Resolution Summary: There is a bit of variation between the above sources, but in general, as of Jan 2011, it looks like 1024x768 accounts for about 20%, while ~85% have a higher resolution of at least 1280x768 (1280x800 is the most common of these, with 15-20% of the total web depending on the source; 1280x1024 and 1366x768 follow behind with 9-14% of the share). My guess would be that the higher resolution values will be even more common if we filter on North America, and even higher if we filter on N. American corporate users (unfortunately I couldn't find any free geographically filtered statistics). Another point to note is that the 1024x768 desktop user population is likely lower than the aforementioned 20%, seeing as the iPad (1024x768 native display) is likely propping up that number (the app I'm designing is Flash-based, and Apple mobile devices don't support Flash, so iPad support isn't a concern). My recommendation would be to optimize around the 1280x768 constraint (note: 1280x768 is actually a relatively rare resolution, but I think it's a valid constraint range considering that 1366x768 is relatively common and 1280 is the most common horizontal resolution).

    Browser + OS Constraints: To further add to the constraints, we have to subtract the space taken up by the browser (assuming IE, which is the most space-consuming) and the OS (assuming WinXP-Win7):
    - Win7 has the biggest taskbar footprint, at a height of 40px (XP's and Vista's is 30px).
    - The default IE8 view uses up 25px at the bottom of the screen with the status bar, and a further 120px at the top of the screen with the window title bar and the browser UI (assuming the default 'favorites' toolbar is present; it would instead be 91px without the favorites toolbar).
    - Assuming no scrollbar, we also lose a total of 4px horizontally for the window outline.
    This means that we are left with 583px of vertical space (768 - 40 - 25 - 120) and 1276px of horizontal (1280 - 4). In other words, a Web Safe Area of 1276 x 583. Is this a correct line of thinking? I'm really surprised that I couldn't find this type of investigation anywhere on the web. Lots of websites talk about designing for 1024x768, but that's only half the equation! There is no mention of browser/OS influences on the actual area you have to display the site/app. Any help on this would be greatly appreciated! Thanks.

    EDIT: Another caveat to my line of thinking above is that different browsers actually take up different amounts of pixels depending on the OS they're running on. For example, under WinXP, IE8 takes up 142px at the top of the screen (instead of the aforementioned 120px for Win7) because the file menu shows up by default on XP, while in Win7 the file menu is hidden by default. So it looks like on WinXP + IE8 the vertical Web Safe Area would be a mere 572px (768px - 142 - 30 - 24 = 572).
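
    To sanity-check these numbers, the actual viewport can also be measured at runtime; a minimal plain-JavaScript sketch (works in IE8 standards mode and other browsers of that era):

        // width/height of the area actually available to the page, excluding
        // browser chrome, the taskbar and any scrollbars
        var safeWidth  = document.documentElement.clientWidth;
        var safeHeight = document.documentElement.clientHeight;
        alert('Web Safe Area: ' + safeWidth + ' x ' + safeHeight);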

    Read the article

  • Storing User-uploaded Images

    - by Nyxynyx
    What is the usual practice for handling user-uploaded photos and storing them in the database and on the server?

    For a user profile image:
    1. After receiving the image file from the user, rename the file to <image_id>_<username>.
    2. Move the image to /images/userprofile.
    3. Add the image filename to a users table containing profile details like first_name, last_name, age, gender, birthday.

    For an image attached to a review done by a user:
    1. After receiving the image file from the user, rename the file to <image_id>_<review_id>.
    2. Move the image to /images/reviews.
    3. Add the image filename to a reviews table containing details like review_id, review_content, user_id, score.

    Question 1: How should I go about storing the image filenames if the user can upload multiple photos for a particular review? Serialize?
    Question 2: Or have another table review_images with columns review_id, image_id, image_filename just for tracking images? Will doing a JOIN when retrieving the image_filename from this table slow down performance noticeably?
    Question 3: Should all the images be stored in a single folder? Will there be a problem when we have 100K photos in the same folder? Is there a more efficient way to go about doing this?
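
    For Question 3, a sketch of the kind of hash-based subfolder scheme I've seen suggested (the function and folder names here are made up):

        <?php
        // Minimal sketch: derive a two-level subdirectory from a hash of the
        // filename so no single folder ends up holding 100K files.
        function shardedPath($baseDir, $filename)
        {
            $hash = md5($filename);
            $dir  = $baseDir . '/' . substr($hash, 0, 2) . '/' . substr($hash, 2, 2);
            if (!is_dir($dir)) {
                mkdir($dir, 0755, true); // create the nested directories as needed
            }
            return $dir . '/' . $filename;
        }

        // e.g. images/reviews/3e/2f/42_1337.jpg
        echo shardedPath('images/reviews', '42_1337.jpg');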

    Read the article

  • Pros and cons of creating a print-friendly page to remove the use of PDFs?

    - by Phil
    The company I work for has a one-page invoice that uses the tcpdf library. They wanted to make some design changes that I found just incredibly difficult to set up in PDF format. Using HTML/CSS I could easily create the page and have it print very nicely, but I have a feeling that I am overlooking something. What are the pros and cons of setting up a page just for printing? What are the pros and cons of putting out a PDF? I could also use the CSS inline, so that if they wanted to download the page and open it later, they could.
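
    The print-friendly page would hinge on a print stylesheet; a minimal sketch (the class names are made up):

        @media print {
            /* hide screen-only chrome */
            nav, .sidebar, .no-print { display: none; }
            /* print-friendly typography on a plain background */
            body { font: 12pt/1.4 serif; color: #000; background: #fff; }
            /* let the invoice fill the paper width, drop screen decoration */
            .invoice { width: auto; margin: 0; box-shadow: none; }
        }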

    Read the article

  • Same sitemap submitted for .com and .co.uk domain

    - by Dean
    Not too sure why I did this, but I submitted the same sitemap for our .co.uk and .com domains. I'm looking to put the .com domain on different hosting and create a new site for international customers using the .com domain. Should I remove all URLs in Google Webmaster Tools for the .com domain (guessing this won't have a negative effect on the .co.uk stuff) and add a robots.txt to make sure the .com domain is not crawled? Thanks
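
    For reference, the disallow-all robots.txt I have in mind would be served only on the .com host (a sketch; whether blocking beats redirecting depends on the plans for that domain):

        User-agent: *
        Disallow: /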

    Read the article

  • Unimportant words on page being seen as keywords due to repetitiveness

    - by user21100
    I have a list of products on my site, and each product has a number of descriptors such as features, price, etc. Next to each product, I list the 10 features, with a graphical icon which lets the user know whether the product has that particular feature or not. In all, I have about 230 products, and I have to add the same list of features to describe each product, so you can see the enormous redundancy here of these "feature names". These feature names, e.g. "water proof", are not important keywords at all, yet due to the sheer volume of these words, Google is seeing them as my most important keywords. Is there any way to get around this, or to tell the bots to place less emphasis on these repetitive words and not view them as important keywords?

    Read the article

  • How to deal with overly aggressive "Link Take Down Demands"?

    - by Eoin
    I've been receiving a large number of emails recently requesting that I clean up link spam from my forum. Initially the emails were very polite and professional, and I was happy to remove the links. Recently the emails have gotten very abrasive; here is a particularly rude example:

        From: [email protected]
        To: [email protected]
        Hi,
        This is the second time we are reaching out to you regarding your link to our site hxxp://www.company-two.com from hxxp://www.my-forum.com/some-topic-id. We really do need to remove this link. We have to report to Google any link we were unable to remove, and I wouldn't want to have to include your site in the list. Could you please remove our link from this page and any other page on your site?
        Thank You,
        Name Changed

    Behind the superficial pleasantries I feel there is some very real maliciousness:
    - Note the email address: DMCA Violations. I don't see how the DMCA is involved here, except as a phrase which tends to strike fear into many people.
    - Also relating to the email address, it doesn't match the company being linked to at all. How am I to trust they are truly operating on behalf of company-two when they don't even use one of its email addresses?
    - My email is hidden by privacypost. While a service with legitimate uses, I feel it's highly unprofessional for communications between two companies.
    - The claim "This is the second time...": every email I've received has started like this, but a check of my spam filters has never revealed a first mail. Initially I gave them the benefit of the doubt; by now, though, it's clear this is a cheap ploy to start me off on the defensive.
    - And finally, worst of all: the threats of reporting me to Google if I don't do everything they ask.

    I sent a polite reply asking for more information. I have no idea if the email address was even valid, but I never received any response. Much later I got this follow-up mail:

        From: [email protected]
        To: [email protected]
        Hi,
        This is the final time we are reaching out to you regarding your link to our site hxxp://www.company-two.com from hxxp://www.my-forum.com/some-topic-id. We will soon be reporting to Google any link we were unable to remove, and currently your site will have to be on the list. Could you please remove our link from this page and any other page on your site? I appreciate your urgent attention to this matter.
        Thank You,
        Name Changed

    This time the from address was more personal, though still not obviously connected to the spammed company. Let's be honest: I don't for one second believe that the companies were the victim of a third-party spammer as they claim. The links in question were generated well over a year ago, and I firmly believe the companies were directly responsible for the spam links in question, a type of spam that has plagued my forum. Now they have the audacity to demand I spend my time cleaning up their mess, using threats to ensure they get their way. Have recent changes in Google's algorithms meant all the cash they spent spamming the web has now turned into a liability? If so, I can see why these companies are all of a sudden running scared. Frankly, cleaning up my forum is a good thing, but the threats they are using sicken me. So my question here is specifically about the threats: are they valid, and would such reports to Google destroy my page rankings? Is there a way I can report this abusive behaviour to Google?

    Read the article

  • Tracking first visit date in Google Analytics

    - by dusan
    I want to track a site's visitor retention using Google Analytics, to see if unregistered users are returning to it within a time frame of 2+ months from now. This blog post seems to be on the right track, but I want to track unregistered users, so I don't have a "join date" or similar variable at my disposal. This other blog post suggests using all 5 GA custom variables, using the first variable slot on the first week, slot 2 on week 2, etc. That method only allows tracking 5 weeks of visitors, and I want to track more than 5 weeks, so I was thinking of using two custom variables in GA: the visitor's first visit date, and the visitor's last visit date. How can I save the first visit date? Because if I save another value in the same slot, the old value will be overwritten, and I don't know how reliable it is to save that variable conditionally (reading the __utmv cookie to check whether the "first visit date" is set, and if it isn't set, saving the current date).
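
    The conditional write described would look roughly like this with classic ga.js (a sketch only; see the caveat in the comments):

        var _gaq = _gaq || [];
        // Visitor-scoped (scope 1) custom variable in slot 1; _setCustomVar must
        // be pushed before the pageview. The indexOf check is crude: it only says
        // whether *any* custom-variable cookie exists, not whether slot 1 already
        // holds the first-visit date, so a real version would parse __utmv properly.
        if (document.cookie.indexOf('__utmv') === -1) {
            _gaq.push(['_setCustomVar', 1, 'First Visit', new Date().toDateString(), 1]);
        }
        _gaq.push(['_trackPageview']);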

    Read the article

  • Do jQuery and MooTools usually conflict if both are used on a webpage? [migrated]

    - by Charming Prince
    I have this website I'm designing. I tried using MooTools 1.3.1 to animate some of the div boxes when clicked, or when the mouse hovers around them, to show the content. The thing is that it doesn't seem to work on the webpage, but if I try the same script on a blank webpage it works. I'm thinking it's probably because I have jQuery 1.5.2 on the same page and the two scripts are conflicting with each other, because if I remove the jQuery, the MooTools works. What should my options be? I need the jQuery to do some validations for me, so I can't remove it completely. Here is the code:

        <script>
        // vertical
        var mySlide = new Fx.Slide('test');
        $('slidein').addEvent('click', function(e){
            e = new Event(e);
            mySlide.slideIn();
            e.stop();
        });
        $('slideout').addEvent('click', function(e){
            e = new Event(e);
            mySlide.slideOut();
            e.stop();
        });
        $('toggle').addEvent('click', function(e){
            e = new Event(e);
            mySlide.toggle();
            e.stop();
        });
        $('hide').addEvent('click', function(e){
            e = new Event(e);
            mySlide.hide();
            e.stop();
        });
        </script>

    Here's the HTML:

        <h3 class="section">Fx.Slide Vertical</h3>
        <a id="slideout" href="#">slideout</a> |
        <a id="slidein" href="#">slidein</a> |
        <a id="toggle" href="#">toggle</a> |
        <a id="hide" href="#">hide</a>
        <div id="test">
            Lorem ipsum dolor sit amet, consectetur adipisicing elit, sed do eiusmod
            tempor incididunt ut labore et dolore magna aliqua. Ut enim ad minim veniam,
            quis nostrud exercitation ullamco laboris nisi ut aliquip ex ea commodo
            consequat. Duis aute irure dolor in reprehenderit in voluptate velit esse
            cillum dolore eu fugiat nulla pariatur. Excepteur sint occaecat cupidatat
            non proident, sunt in culpa qui officia deserunt mollit anim id est laborum.
        </div>

    Here's the CSS:

        #test {
            background: #222; color: #fff; padding: 10px;
            margin: 20px; border: 10px solid pink;
        }
        #test2 {
            background: #222; color: #fff; padding: 10px;
            margin: 20px; border: 10px solid pink;
        }

    I'm using the exact same code supplied by MooTools in their own example. If I do this on a blank webpage it works, but incorporated into my own webpage it doesn't, and my own page just has the script tag for the jQuery in the head section of the HTML.
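
    From what I've read, both libraries claim the global $ symbol, which would explain the symptoms. A sketch of the usual fix (file paths are made up): load jQuery first, call jQuery.noConflict() to release $, then load MooTools so its $ (document.id) takes over:

        <script src="js/jquery-1.5.2.min.js"></script>
        <script>
        // release $ back to whoever loads next; jQuery stays available as jQuery
        jQuery.noConflict();
        // jQuery code can still alias $ locally inside a ready callback:
        jQuery(function($) {
            // validation code using $ as jQuery goes here
        });
        </script>
        <script src="js/mootools-core-1.3.1.js"></script>
        <!-- MooTools now owns $, so the Fx.Slide example works unchanged -->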

    Read the article

  • PageRank sharing between domains?!

    - by Senthil
    I own three domains, say example.com, example.in, and example.co.in. I have bought the .in and .co.in domains only to protect the brand. But I have this question: if I make the other two also point to my hosting, so that regardless of which one the user types they are taken to the same website, will the PageRank be split into three, with each domain having one third the actual PR value? What should I do with the other two domains? Where should I point them if I don't intend to use them at all (i.e., what should I give in place of the ns1.myprovider.com, ns2.myprovider.com, etc.)?
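
    From what I understand, the usual way to avoid any splitting is to pick one canonical domain and 301 the others to it; a sketch assuming Apache mod_rewrite and all three domains pointing at the same docroot:

        RewriteEngine On
        # any host other than www.example.com gets a permanent redirect to it
        RewriteCond %{HTTP_HOST} !^www\.example\.com$ [NC]
        RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]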

    Read the article

  • How to overcome the fear of drawing a website template? [closed]

    - by Ricky
    I have a problem with website design. I can understand CSS, I use CSS grid frameworks, etc., but I can't realize my own ideas. If I have a template, I can translate it to HTML and CSS, but I don't understand how to begin designing a website from scratch. I think I'm afraid to draw, or can't figure out the first steps. How do I overcome this fear? Maybe there are specific techniques, like mind maps, to streamline the process? Or something else? Or you can share what you do when you begin. If anyone has some ideas... Thank you!

    Read the article

  • Can't see like plugin iframe on (at least) some browsers [migrated]

    - by MEM
    Not sure why. I grabbed the code from http://developers.facebook.com/docs/reference/plugins/like/, and as stated there, we can read: "href - the URL to like. The XFBML version defaults to the current page. Note: After July 2013 migration, href should be an absolute URL". So I did:

        <body>
        <div>
            <iframe src="//www.facebook.com/plugins/like.php?href=https%3A%2F%2Fwww.facebook.com%2Fprojectokairos&amp;width=100&amp;height=21&amp;colorscheme=light&amp;layout=button_count&amp;action=like&amp;show_faces=false&amp;send=false"
                    scrolling="no" frameborder="0"
                    style="border:none; overflow:hidden; width:100px; height:21px;"
                    allowTransparency="true"></iframe>
        </div>
        </body>

    Could this be related to the fact that the page is unpublished? I hope not, because I do need to place the button here and there on several pages before the FB page goes live.

    Read the article

  • Bookmark login_email at new PayPal URL [closed]

    - by Jonna Stevens
    I have used this bookmark in Firefox so that my email would be autofilled and I only had to type in my password. PayPal has recently changed its login URL. Has anybody figured out a method to achieve this with the new URL?

    Old URL: https://www.paypal.com/es/cgi-bin/webscr?cmd=_login-run&login_email=myemail%40myemail.com
    New URL (not working): https://www.paypal.com/es/webapps/mpp/home-merchant?login_email=myemail%40myemail.com

    Read the article

  • SEO tool is telling me title, description and keywords don't exist, but they do. Where is the problem?

    - by DaveDev
    I'm using the following tool to analyse how 'optimal' a site that I'm working on is for search engines: http://tools.seobook.com/general/spider-test/ I enter the URL for the site - http://ftmsuat.moneymate.com - into the search bar, and it returns a breakdown of the contents of the page. I'm a little confused by what I see though. According to the results, the page doesn't have a title, description or keywords. But if you check the source of the page, those elements are definitely there. So I'm wondering now, which is wrong? seobook.com or my page?

    Read the article

  • Possible to use a surrogate to buy a .it domain?

    - by Matthew Reinbold
    I'm a US citizen interested in buying an Italian TLD (*.it). However, those domains can only be registered by EU citizens or residents, or businesses with a registrant who is an Italian citizen and resident. Are there companies that provide a 'surrogate'-like service, fulfilling the requirements for registration while I administer the domain properties? What are they, and what can I expect to pay the middleman? Or am I a horrible person for even considering 'circumventing' the intent of the restriction?

    Read the article

  • Gaming Community CMS, with forum integration [closed]

    - by Tillman32
    Possible Duplicate: Which Content Management System (CMS) should I use?

    I've had a simple website that I coded myself for a while now; the site is a gaming community, very forum- and news-driven. It was a HORRIBLE idea to take on coding this thing myself. Although we've used it for about a year now, we're just getting too big, and I need to streamline our work. I need writers to post news, etc.; I've been doing it through code (a year ago I thought that would be a cool idea). Anyway, I've been messing with just about every CMS out there, and I'm struggling to find something that I really like. The main things I need are a good news system and good forum integration. I'm sort of picky when it comes to looks; it's a curse. Reading on here, I see a lot of people saying Drupal is the best for the three things I need: community interaction, news, and forums. The main issues I ran into with Drupal were ease of use and themes. I am not a web designer, and I need a good theme. For an idea of what I'm looking for, go check out http://www.clgaming.net: they have forums integrated, a nice news area on the home page/news section, and nice user accounts. It looks very professional, and I doubt I'll get close to that with a free theme, but their functionality is exactly what I need. Any ideas would be greatly appreciated.

    Read the article

  • One-to-many problem with implementing 301 redirects after changed URLs

    - by user16136
    I have a problem. I had an old set of dynamic URLs which I have now split into multiple static URLs, e.g.:

    www.mydomain.com/product.php?type=1&id=2
    www.mydomain.com/product.php?type=2&id=3
    www.mydomain.com/product.php?type=2&id=4, etc.

    which I have changed to something like:

    www.mydomain.com/electronics/radio
    www.mydomain.com/electronics/television
    www.mydomain.com/mobile/smartphone, etc.

    Google has previously indexed the dynamic URLs and search results show the old URLs. I want search to point to the new URLs. I have kept the old URLs active, so both URL forms work. How can I set up a 301 redirect in this case? I run IIS, and it only allows a page to be redirected to one URL. Should I deactivate the old dynamic URLs? In that case I lose all the previous SEO rankings...
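
    From what I've read, the IIS URL Rewrite module (an add-on -- plain IIS page redirects can't branch on the query string, but this module can) lets each old query-string URL get its own 301 rule; a sketch for the first mapping, with the others following the same pattern (or a rewriteMap for many of them):

        <rewrite>
          <rules>
            <rule name="radio 301" stopProcessing="true">
              <match url="^product\.php$" />
              <conditions>
                <!-- match the exact old query string type=1&id=2 -->
                <add input="{QUERY_STRING}" pattern="^type=1&amp;id=2$" />
              </conditions>
              <action type="Redirect" url="/electronics/radio"
                      appendQueryString="false" redirectType="Permanent" />
            </rule>
          </rules>
        </rewrite>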

    Read the article

  • SEO for images: can I use a different (cookieless) domain?

    - by Oliver
    Hello. We want to increase the value of some of our important images by means of SEO, and we want to start serving them from a different, i.e. cookieless, domain. We want to go from http://www.example.com/images/1234.jpg to http://www.example.com/germany/bavaria/landscape.jpg, which can easily be done via URL rewriting. On the other hand, we would like to serve the image from a completely different domain, let's say http://www.examplestatic.com/germany/bavaria/landscape.jpg, to save the overhead of sending the cookie from www.example.com. Somehow I feel that this is not a good idea, because I move the image away from the content by putting it on a different domain. Can anyone shed some light on this problem? Naturally, I would just use a different subdomain, e.g. img.example.com, but we already use subdomains for languages, and our cookies are valid for all subdomains of example.com, so this won't help. I'd really appreciate any hints. Cheers
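
    For the rewriting part, an Apache mod_rewrite sketch (our actual server and file layout may differ) that serves the descriptive URL from the numeric file without a redirect:

        RewriteEngine On
        # /germany/bavaria/landscape.jpg is served from the real file internally,
        # so the browser (and Googlebot) only ever sees the descriptive URL
        RewriteRule ^germany/bavaria/landscape\.jpg$ /images/1234.jpg [L]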

    Read the article

  • Site Description h2 vs p

    - by user1010609
    I tend to follow this HTML structure while creating a new site, on my main page:

        <div class="header">
            <img alt="keyword" title="keyword logo" src="keyword.png" />
            <div>
                <h1>keyword</h1>
                <p><b>keyword + hierarchy keywords</b></p>
            </div>
        </div>

    As you can see, I'm using <p><b></b></p> to hold a short description of the site, but I was wondering if <h2> would be better to use here.

    Read the article

  • Will multivariate (A/B) testing applied with 302 redirects to a subdomain affect my Google ranking?

    - by Lior
    I want to do an A/B test of an entire site for a new design and UX, with only slight changes in content (a big brand site that has good Google rankings for many generic keywords). My idea of implementation is doing a 302 redirect to the new version (placing it on the www1 subdomain) and allowing only user agents of known browsers to pass. The test version will have a disallow-all robots.txt. Will Google treat this favorably, or do I have to use Google Website Optimizer (which will give me tracking headaches)?

    Read the article

  • Weird Results in an A/B Test in Google Website Optimizer

    - by Yisroel
    I set up a test in Google Website Optimizer that has 3 variations: original (A), B, and C. In order to further validate the results of the test, I added variation C as an exact copy of the original. And that's where the results get weird. Six days into the test, the best performing variation is C. It outperforms the original by 18.4%! How is that possible? Do I now discount the results of this test entirely?

    Read the article

  • Moving one site in Webmaster Tools to more than one site [closed]

    - by Towhid
    Possible Duplicate: How should I structure my urls for both SEO and localization?

    I have a question-and-answer site about immigration. I have now divided it into 2 sites:
    - mysite.co.uk, about immigration to the UK
    - mysite.com, with subdomains for every country, like australia.mysite.com, sweden.mysite.com, ...
    I have moved all the content from my first site into the .co.uk and .com sites and their subdomains to fill them. I know that Google will detect my 2 new sites as duplicates of the first one, which is very bad for SEO, and I don't think Google Webmaster Tools has a tool for this. So please guide me on how to fix this problem.

    Read the article
