Search Results

Search found 9724 results on 389 pages for 'pro zeck'.

Page 165/389 | < Previous Page | 161 162 163 164 165 166 167 168 169 170 171 172  | Next Page >

  • How to get a new site indexed by Alexa

    - by JohnMerlino
    When I look up my site on Alexa, it says "Alexa Traffic Rank: No Data", so I googled the issue and came across this page: http://www.loudable.com/my-website-data-not-showing-in-alexa-get-your-website-crawled-by-alexasolution.html It says that to get a site indexed you should click "Crawl my site" on the webmasters page. However, there is no longer a link that says "Crawl my site". So, as of now, does anyone know how to get a site indexed by Alexa so that my traffic rank will show up in the Alexa index?

    Read the article

  • I Purchased a Domain that Previously had a Google Apps Account. How Do I Re-Create The Google Apps Account or Take Ownership of it?

    - by jmort253
    I recently purchased a domain name. We'll call it example.com. The previous owner of the domain had a Google Apps account. Now that I own the domain, I want to create a Google Apps account so I can point the domain www.example.com to one of my Google App Engine domains. We'll call it application.appspot.com. Google App Engine won't allow me to add the domain without verifying ownership by creating a Google Apps account or logging into Google Apps, but I don't have access to the old Google Apps account. We've tried going to this address to take ownership: https://www.google.com/a/cpanel/example.com/ResetAdminPassword?c=LONG_KEY&hl=en_US We retrieved a new password, but it wouldn't tell us what the login name is. How do you find out the login name?

    Read the article

  • Are the famous websites handmade? [closed]

    - by Mithun Chuckraverthy
    I'm a newbie in web design. I've always wanted to build a professional-quality website by myself, so I started learning HTML/XHTML and CSS for presentation, and JavaScript and PHP/MySQL for scripting. I wonder: do the developers of famous websites design them by hand, or have they found a better approach using software? If so, can you tell me what it is? (By "famous" I mean any websites that are liked by millions of people all over the world, like Google, Facebook, etc.) Thanks in advance!

    Read the article

  • Images not indexed by Google since moving to CDN

    - by dfunkydog
    Last week I moved all the images on coffeeandvanilla.com to a CDN (maxcdn.coffeeandvanilla.com). The problem I'm having is that although the sitemap (generated by the Yoast WordPress SEO plugin) points images to the correct location, Google only indexes images from the category and page sitemaps, and 0 images from the posts sitemap (see screenshot https://dl.dropbox.com/u/4635252/sitemap.png). The website had been doing quite well in Google image search before the change; visits from image search have dropped from ~200/day to 11 yesterday. Here is an example entry from the generated posts.xml sitemap: http://pastebin.com/vcMRf9VW Can anyone suggest where the problem lies? Why have I lost all my Google image juice? Should I just wait some more, and if so, how long before really worrying?
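
    For readers who can't open the pastebin, a posts sitemap entry using Google's image extension looks roughly like the sketch below; the post and image URLs here are made up, and the urlset element has to declare the image namespace for the image: tags to count:

        <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
                xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
          <url>
            <loc>http://coffeeandvanilla.com/example-post/</loc>
            <image:image>
              <image:loc>http://maxcdn.coffeeandvanilla.com/wp-content/uploads/example.jpg</image:loc>
            </image:image>
          </url>
        </urlset>

    If the entries Yoast generates already match this shape, the sitemap format itself is probably not the culprit.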

    Read the article

  • In practice, what are the key differences between Heroku and webfaction? [closed]

    - by jdotjdot
    I've been building and hosting web apps, mainly in Django and Flask, for some time now. Mostly I've hosted them on Heroku, because of the free tier and the ease of git-based application updates. I have seen that a lot of Django users prefer Webfaction. I looked through their offerings, and they seem to me like a standard web hosting service. Questions: Why might Webfaction be considered a good hosting service for Django apps? If Heroku is generally called a "Platform-as-a-Service", what does that make Webfaction? Does it have any important similarities to or distinctions from Heroku that I might somehow be missing?

    Read the article

  • Is it possible for a web server to send more files than were requested, and have the browser accept them?

    - by Osiris
    I've created a basic web server for a school project, and it serves static content without a problem. I thought of having the server parse all htm/html files for links to .js/.css/image files, and send those files to the client without the client requesting them later. E.g. the browser requests index.html, and the server responds with index.html and image.jpg. I modified the server to send two distinct HTTP responses for a "GET /index.html HTTP/1.1" (one for the HTML page and one for the image), but the browser ended up requesting the image when it was good and ready. Is there any way to bypass this (use a multipart response, perhaps)? Will these files be accepted by most browsers, or will they be rejected for security reasons?
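
    For illustration only, a multipart response on the wire would look roughly like the sketch below (the boundary string and lengths are made up). This shows the format, not a guarantee of acceptance: mainstream browsers are generally not expected to treat a multipart/mixed body as a page plus its subresources, which is exactly the open question here.

        HTTP/1.1 200 OK
        Content-Type: multipart/mixed; boundary=frontier

        --frontier
        Content-Type: text/html

        <html>...index.html markup...</html>
        --frontier
        Content-Type: image/jpeg
        Content-Length: 12345

        ...binary JPEG data for image.jpg...
        --frontier--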

    Read the article

  • At what point should data be sent back to server?

    - by whamsicore
    A good example would be the Stack Exchange vote button. When a post is upvoted, the arrow changes color immediately. However, there is a grace period during which you can change your vote decision (oops! voted by mistake?). Is the upvote action processed immediately, or is it only processed after a set time period, or when the user leaves the page? How exactly is this rating processed? What is the standard for handling dynamic page edits (e.g. Stack Exchange ratings, Facebook posts)?

    Read the article

  • Why are subdomains of sites like Blogspot/WordPress treated as different domains or sites?

    - by Thedijje
    As far as I know, maps.google.com and mail.google.com both come under the same domain; they are subdomains. The entire web treats these subdomains as part of the main domain, and they share the same Alexa rank, PageRank and so on. On the other hand, take a look at blogspot.com, wordpress.com or webs.com: blogs or websites under those domains are treated as different sites. Each is a new URL, and each has its own PageRank and Alexa rank. There are millions of subdomains under those few domains, with almost the same IP address, hosting and CMS, so why are they considered different domains?

    Read the article

  • How to prevent a specific website from linking to our domain?

    - by Edward
    We have a landing page which is used only for running an ad campaign. A website has found this link somehow and is linking to it. I've been told by marketing that they don't want that website linking to the landing page. How can I prevent a specific website from linking to our domain? I don't want to block all websites from linking to it, just this specific one. Is there a solution involving .htaccess? If so, please provide an example or a link to one, because I've been unable to find anything.
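
    You can't stop another site from publishing a link, but you can refuse to serve requests that arrive with that site as the referrer. A minimal mod_rewrite sketch for .htaccess, with the other site's hostname (badsite.example) and the landing page path (landing-page) made up for illustration:

        RewriteEngine On
        # Deny requests for the landing page whose Referer header points at the unwanted site
        RewriteCond %{HTTP_REFERER} ^https?://(www\.)?badsite\.example/ [NC]
        RewriteRule ^landing-page/?$ - [F]

    Keep in mind the Referer header is optional and easy to spoof, so this only filters ordinary click-throughs from that site.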

    Read the article

  • Enabling/disabling proftpd accounts with PHP and WHM

    - by Brett G
    I have a VPS with WHM/cPanel which is used just by me, running proftpd. I'd like to disable/enable a specific FTP account from a PHP script. I've done this by having PHP call a bash script which removes/adds the user's account line in the /etc/proftpd/USERNAME password file. However, in order to do this I have to give "other" write permission on /etc/proftpd/USERNAME. That isn't ideal, and I'd be willing to do it another way. It also seems like WHM automatically resets these permissions on a regular basis. Does anybody have any ideas on a better way to deal with this?
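
    A rough sketch of one alternative, assuming a small helper script (the name toggle_ftp_user.sh is made up) that is the only thing allowed to edit the proftpd password file and is invoked through sudo, so the file itself never needs "other" write permission; the matching sudoers entry for the web server user is also an assumption, not something WHM configures for you:

        <?php
        // Enable or disable one FTP account by delegating the file edit
        // to a dedicated helper script run via sudo (hypothetical setup).
        function setFtpAccountState($username, $enabled)
        {
            $action = $enabled ? 'enable' : 'disable';
            $cmd = sprintf(
                'sudo /usr/local/bin/toggle_ftp_user.sh %s %s 2>&1',
                escapeshellarg($action),
                escapeshellarg($username)
            );
            exec($cmd, $output, $exitCode);
            return $exitCode === 0;
        }

        // Example: disable the account "backupuser"
        setFtpAccountState('backupuser', false);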

    Read the article

  • Will search engines discover that our old pages have been 301 redirected if there are no more links to them in the old site?

    - by Obay
    We've moved our website to a new domain. Thousands of our pages are served by one PHP file on the old site (e.g. oldsite.com/news.php?id=<id>), so we added some code to news.php to do a 301 redirect to the corresponding news article on the new website (newsite.com/news/<id>). We have not yet done a 301 redirect for the root of the old site (so we can display a notice to our users that we've moved), but all links inside it are already 301 redirected. My concern is that when Google crawls our old website, it will no longer be able to find the old news articles and discover that they have been 301 redirected -- is this correct? If so, does that mean our PageRank won't be carried over to the new site? I've also read that we need to create a sitemap for the new site. Is it possible to indicate in the sitemap the old and new locations of specific pages? Because if not, how will Google know? (I'm not sure the Change of Address tool in Webmaster Tools would be specific enough.)
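
    For context, the redirect code in the old news.php presumably looks something like the sketch below (the hostnames are from the question; the rest is an assumption). The key point is that each old URL answers with a 301 status and a Location header pointing at its new counterpart, and a crawler only sees that when it actually re-requests the old URL:

        <?php
        // Old site: news.php?id=<id>  ->  301 redirect to the matching article on the new site
        $id = isset($_GET['id']) ? (int) $_GET['id'] : 0;
        header('Location: http://newsite.com/news/' . $id, true, 301);
        exit;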

    Read the article

  • Google Analytics Dashboard: week-by-week view

    - by Silver Dragon
    Setting up the Google Analytics Dashboard allows webmasters to get a weekly progress report of marketing achievements and keep a finger on what's going on at their web properties. However, by default the dashboard always displays a day-by-day report, which isn't actionable in markets where meaningful improvements happen on a week-by-week or month-over-month basis. Is there any way the default view (and the reports sent out via email) can be set to week-level resolution, as opposed to day-level resolution? (i.e., repro: Analytics - site - Standard Reports - Audience - Overview - right side of the window, click "Week") Many thanks!

    Read the article

  • OpenX API advertiser statistics call [migrated]

    - by Sameer
    I am trying to write a JSP application which will establish an XML-RPC connection with openxapi and return the values. I am using openxapi v1. Here I get the dates through a datepicker and then convert them to Date objects:

        String dateStr = request.getParameter("datum1");
        SimpleDateFormat formater = new SimpleDateFormat("dd-MM-yyyy");
        Date result1 = formater.parse(dateStr);
        String dateStr2 = request.getParameter("datum2");
        SimpleDateFormat formater2 = new SimpleDateFormat("dd-MM-yyyy");
        Date result2 = formater2.parse(dateStr2);

    Then I call the service provided by openxapi (Advertiser Daily Statistics), passing (sessionID, advertiserID, from date, to date):

        Object[] objects1 = (Object[]) client.execute("advertiserDailyStatistics", new Object[]{sessionId, 3, result1, result2});

    Read the article

  • Can I include a robots meta tag outside of the head in HTML snippets intended to be SSIed?

    - by Dan
    I have a number of files on my site which are not intended for independent viewing, but rather to be AJAXed into content within the site. As independent entities they obviously don't meet HTML standards (no body, head, etc.). I would like to prevent search engines from indexing these pages, but I do not have access to /robots.txt (which would be much more convenient). My question is: could I include the following at the top of these partial HTML files and get the desired result? <meta name="robots" content="noindex, noarchive"> I guess there are two parts to this question: Will this cause any rendering issues in any browsers? Will search engines (at least Google and Bing) interpret this as intended?
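
    One alternative worth noting (it is not mentioned in the question): the same directives can be sent as an X-Robots-Tag HTTP response header, which keeps the meta tag out of a body-less fragment entirely. A sketch, assuming the fragments are served through a small PHP wrapper; the file name partials/snippet.html is made up:

        <?php
        // Serve an HTML fragment while telling crawlers not to index or archive it
        header('X-Robots-Tag: noindex, noarchive');
        header('Content-Type: text/html; charset=utf-8');
        readfile(__DIR__ . '/partials/snippet.html');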

    Read the article

  • How to track in Google Analytics which registrations come from Google AdWords ads?

    - by automatix
    I created a campaign in Google AdWords with some ads in it, and gave them URLs like:

        mydomain.tld/registration/?utm_campaign=mycampaing&ad=x
        mydomain.tld/registration/?utm_campaign=mycampaing&ad=y
        mydomain.tld/registration/?utm_campaign=mycampaing&ad=z

    All ads lead to the registration page. A registration is a visit to the page mydomain.tld/registration-complited/?user={ID}, so I can track registrations in Google Analytics: I just go to Behavior -> Site Content -> All Pages and filter the pages to registration-complited. But how can I see how many and which users registered after they came from an ad of a campaign, e.g. utm_campaign? And how can I also track this for a single ad of the campaign, e.g. x?

    Read the article

  • PHP form: default text when a field is left blank [migrated]

    - by Joe Turner
    I'm creating a mobile landing page, and I have also created a form that lets me create more by duplicating a folder that holds a template file. The script takes you to a page where you input the company details one by one and press submit, and then the page is created. My problem is that when a field is left out (YouTube, for instance), the corresponding button is created blank. I would like there to be a default text for when no text is entered. I've tried a few things and have been struggling to make this work for DAYS!

        <?php
        $company = $_POST["company"];
        $phone   = $_POST["phone"];
        $colour  = $_POST["colour"];
        $email   = $_POST["email"];
        $website = $_POST["website"];
        $video   = $_POST["video"];
        ?>
        <div id="contact-area">
        <form method="post" action="generate.php"><br>
        <input type="text" name="company" placeholder="Company Name" /><br>
        <input type="text" name="slogan" placeholder="Slogan" /><br>
        <input class="color {required:false}" name="colour" placeholder="Company Colour"><br>
        <input type="text" name="phone" placeholder="Phone Number" /><br>
        <input type="text" name="email" placeholder="Email Address" /><br>
        <input type="text" name="website" placeholder="Full Website - Include http://" /><br>
        <input type="text" name="video" placeholder="Video URL" /><br>
        <input type="submit" value="Generate QuickLinks" style="background:url(images/submit.png) repeat-x; color:#FFF"/>
        </form>

    That's the form. It takes the variables and posts them to the file below (generate.php), which writes out the page:

        <?php
        $File = "includes/details.php";
        $Handle = fopen($File, 'w');
        $Data = "<div id='logo'>
        <h1 style='color:#$_POST[colour]'>$_POST[company]</h1>
        <h2>$_POST[slogan]</h2>
        </div>
        <ul data-role='listview' data-inset='true' data-theme='b'>
        <li style='background-color:#$_POST[colour]'><a href='tel:$_POST[phone]'>Phone Us</a></li>
        <li style='background-color:#$_POST[colour]'><a href='mailto:$_POST[email]'>Email Us</a></li>
        <li style='background-color:#$_POST[colour]'><a href='$_POST[website]'>View Full Website</a></li>
        <li style='background-color:#$_POST[colour]'><a href='$_POST[video]'>Watch Us</a></li>
        </ul>
        \n";
        fwrite($Handle, $Data);
        fclose($Handle);
        ?>

    That's what the form turns into. I need a default link to be put in when a field is left blank, which it sometimes is. Thanks in advance, guys.
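
    One way to get the default-text behaviour being asked for (a sketch, not a drop-in fix): run each field through a small helper that substitutes a fallback when the submitted value is blank, and build $Data from those values instead of reading $_POST directly. The helper name, the fallback strings and the htmlspecialchars() escaping are all additions for illustration:

        <?php
        // Return the trimmed POST field, or a default when the field was left blank
        function postOrDefault($key, $default)
        {
            $value = isset($_POST[$key]) ? trim($_POST[$key]) : '';
            return $value === '' ? $default : $value;
        }

        $company = htmlspecialchars(postOrDefault('company', 'Your Company'));
        $video   = htmlspecialchars(postOrDefault('video', 'http://example.com/video'));
        // ...repeat for slogan, colour, phone, email and website,
        // then use these variables when writing out $Data in generate.php.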

    Read the article

  • Perl script rendered in browser as code through symlink - fine when accessed directly

    - by John Dittmar
    I have a Rails 4 app that has some views that post to Perl CGI scripts. The Perl scripts are accessed via a symbolic link to a folder called "cgi-bin". When I navigate to a Perl script through the symbolic link, it is rendered as text instead of executed (e.g. localhost:3000/cgi-bin/test.cgi); however, when I access it directly it executes without issue (e.g. localhost/path/to/cgi-bin/test.cgi). I am using Apache 2 on OS X. In the directory localhost/path/to/ I have an .htaccess file that contains the following:

        # General Apache options
        AddHandler fastcgi-script .fcgi
        AddHandler cgi-script .cgi
        Options +FollowSymLinks +ExecCGI

    I have the exact same lines in the .htaccess file under localhost:3000/. I have also uncommented AllowOverride All in httpd.conf. There are no errors in Apache's error log. When I access the direct link to test.cgi, a new line is appended to Apache's access log; when I access the script through the symbolic link (and it is rendered as text), no line is appended to the access log. Any idea why this happens? This setup worked fine with a previous version of Rails on OS X, but recently I upgraded to Mavericks and figured I should update the Rails application to v4.0 as well.

    Read the article

  • Strategy for hosting 700+ domains, each with static HTML site

    - by jonschlinkert
    I have a portfolio of more than 700 domain names, and ideally I'd like to put up a single-page HTML/CSS/JavaScript site for each domain. Is there a system/strategy/workflow that will allow me to: (1) Automate the deployment of new websites quickly and easily, without having to manually initiate each new website in an admin panel. For instance, I've seen Dropbox-based solutions that claim to make it simple to set up new websites on your Dropbox account, but you still have to set each one up in an admin interface first. It would be so much easier to have a folder naming convention that allowed the user to easily clone/copy/duplicate sites inside their Dropbox App folder (https://www.dropbox.com/developers/blog/23) to create new ones. Sounds interesting, however... (2) Managing CNAMEs on the registrar side is easy; is there a way to quickly associate CNAMEs with new websites, maybe gh-pages-style (https://help.github.com/articles/setting-up-a-custom-domain-with-pages)? With GitHub's gh-pages, all you have to do is drop a file called CNAME into your repo, with the domain name you want associated with the repo inside the file. Unfortunately, gh-pages isn't a good solution for what I'm doing, though. I'm also a front-end developer specializing in rapid web development and front-end build systems, so building and maintaining static assets for hundreds of sites is no problem. It's the hosting side that I really struggle with. Any suggestions?

    Read the article

  • Displaying the same page, no matter what URI

    - by jgauffin
    We have moved a web application and would like to display a message on the old IIS server. Let's say that the application was at http://oldserver/appname/. How do I make sure that our moved.html is displayed to the user no matter which URI the user browses to within that virtual folder? For example:

        http://oldserver/appname/some/path.aspx  -- should display http://oldserver/appname/moved.html
        http://oldserver/appname                 -- should display http://oldserver/appname/moved.html

    Read the article

  • SEO: disallowing Google from indexing forms in iframes or not?

    - by Marco Demaio
    I usually place forms in iframes (i.e. order forms, request-assistance forms, contact forms, etc.). Just the forms; I never place other content or pages in iframes. From an SEO point of view, would you exclude the forms from being indexed/crawled by Google or not? I mean, my forms hardly ever contain keywords/keyphrases; moreover, I obviously place empty title/meta description tags in the pages shown in the iframes to display the forms, because those titles are never displayed in the browser title bar. So I'm wondering, what's the point of letting Google index them? Moreover, I think these form pages might suck PageRank out of all the other pages that are more valuable for SEO. If your answer is "yes, I would exclude them from indexing", would you simply use robots.txt to exclude them? Thanks!

    Read the article

  • Which of these URL scenarios is best for big link menus? [SEO / user-friendly URLs]

    - by Sam
    Hi folks, a question about URLs... A good friend of mine and I are exploring the possibilities of three scenarios for a website where each web page has a menu system with about 130 links:

    SCENARIO 1: the page's menu system has SHORT non-descriptive hyperlinks as well as a SHORT canonical: <a href="design">dutch design</a>, and the page's canonical URL points to e.g. "design".

    SCENARIO 2: the page's menu system has SHORT non-descriptive hyperlinks with LONG canonical URLs: <a href="design">dutch design</a>, and the page's canonical URL points to "dutch-design-crazy-yes-but-always-honest".

    SCENARIO 3: the page's menu system has LONG descriptive hyperlinks with LONG canonical URLs: <a href="dutch-design-crazy-yes-but-always-honest">dutch design</a>, and the page's canonical URL points to "dutch-design-crazy-yes-but-always-honest".

    Currently we have scenario 2... should we move to scenario 3? All three work fine and point via mod_rewrite to the same page, which is fetched behind the scenes. Now, my question is which of these is better in terms of: user-friendliness (page loading times, whether the full URL is visible in the URL bar), SEO-friendliness (proper indexing because the URLs contain descriptive, relevant terms), and other concerns we may have forgotten, like possible penalties for so many words in link hrefs. Thanks very much for your suggestions; much appreciated!

    Read the article

  • Hosting advice for a write-heavy dynamic website

    - by Rahul Rawat
    I have built a website using PHP and MySQL and now I am looking for a hosting service. I am expecting about 1,000 users registering and about 5-10k pageviews/day within a week. So which host should I opt for? The site lets users submit content stored as blobs and upload around 10 pictures per user. I hope that traffic will increase, so can JustHost's or BlueHost's shared hosting serve that purpose, or should I go for something more dedicated? Basically, the site is write-heavy, averages 2-3 MySQL queries per page, and is quite dynamic. Given these requirements, which web hosting would be optimal for me?

    Read the article

  • Subdomain still times out after being set up a month ago

    - by user8137
    I'm a newbie at this, and this has probably been asked already, but the threads online were close but too vague in their answers, so I've probably really messed this up. I would really appreciate specific, step-by-step instructions. This is what I'd like to do: have the subdomain www.high-res.domain.com be accessible to external customers with specific permissions to access the site (like FTP). We use Network Solutions to house domain.com. We recently added a new IP address to point to www.high-res.domain.com, and I gave that IP address to the company that hosts our website. I pinged www.high-res.domain.com and it resolves to the correct IP address but still times out. It's been a few weeks now, and when you ping it, it still times out:

        C:\> ping XXX.XXX.X.XXX
        Pinging XXX.XXX.X.XXX with 32 bytes of data:
        Request timed out.
        Request timed out.
        Request timed out.
        Request timed out.
        Ping statistics for XXX.XXX.X.XXX:
            Packets: Sent = 4, Received = 0, Lost = 4 (100% loss).

    Tracert times out as well. I even went to DNS tools and a few other sites to check this, and they show the same thing. I recently went into the DNS management console on our server (Win2k3 SP1) and created an A record under DomainDnsZones, which shows up as a CNAME when you look at it. Under the domain it has two entries, one for the subdomain and the other for the website host, each with a separate IP address. Is this correct? The website people are too busy on another project to research it further, and my friends haven't gotten back to me. Please help. Thanks, KK

    Read the article

  • Password protect an aliased virtual directory

    - by Jason
    I have a main domain hosted through cPanel. I also have a subdomain that I would like to appear as a path under the main domain instead of as a subdomain. So I have http://example.com/ pointing to the main hosted files and http://example.com/mydir pointing to the subdomain files. This is achieved by an httpd.conf include from the main domain section that sets an alias:

        Alias /mydir /path/to/subdomain/files/

    Now, that works fine so far. The problem is that if a .htaccess file under /path/to/subdomain/files/ contains an error, the alias is completely skipped and /mydir goes instead to the main host's files. That is kind of surprising to me; I would expect an error to return an error instead. Now the killer: if I try to password protect /path/to/subdomain/files/, then trying to access http://example.com/mydir will again attempt to deliver from under the main hosted files and not from /path/to/subdomain/files/. I am not seeing any errors reported about the .htaccess file in the Apache error log, so I am assuming the .htaccess is valid:

        AuthUserFile /path/to/valid/readable/.htpasswd
        AuthName "Secure Access"
        AuthType Basic
        Require valid-user

    This kind of behaviour does not seem right to me. Is there something obvious that could be causing it? Or is this just the way it works? Perhaps using an alias is the wrong way to go?
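
    For comparison, one common way to attach the authentication to the aliased path in httpd.conf itself, rather than relying on a .htaccess file that may be skipped, is a <Directory> block on the real filesystem path. A sketch using the same paths as above:

        Alias /mydir /path/to/subdomain/files/
        <Directory /path/to/subdomain/files/>
            AuthType Basic
            AuthName "Secure Access"
            AuthUserFile /path/to/valid/readable/.htpasswd
            Require valid-user
            # Only needed if you still want .htaccess overrides inside this directory
            AllowOverride AuthConfig
        </Directory>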

    Read the article

  • Can canonical links be used to make 'duplicate' pages unique?

    - by merk
    We have a website that allows users to list items for sale. Think eBay, except we don't actually handle selling the item; we just list it for sale and provide a way to contact the seller. Anyhow, in several cases sellers have multiple units of an item for sale. We don't have a quantity field, so they upload each item as a separate listing (and using a quantity field is not an option). So we have a lot of pages which basically have the exact same info and where only the item # might be different. The SEO guy we've started using has said we should put a canonical link on each page, and have the canonical link point to itself. So for example, www.mysite.com/something/ would have a canonical link of href="www.mysite.com/something/". This doesn't really seem kosher to me; I thought canonical links were supposed to point to other pages. The SEO guy claims doing it this way will tell Google that all these pages are indeed unique, even if they basically have the same content. This seems a little off to me, since what's to stop a spammer from putting up a million pages and doing this as well? Can anyone tell me whether the SEO guy's suggestion is valid or not? If it's not valid, do I need to figure out some way to detect duplicated items and automatically pick one of the duplicates to serve as the original, and generate canonical links based off that? Thanks in advance for any help.
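
    For reference, a canonical link is just a <link> element in each page's head. A sketch of the two interpretations, with made-up item URLs: the first is the self-referencing form the SEO consultant describes, the second points every duplicate listing at one listing chosen as the original:

        <!-- On www.mysite.com/item-12345/ : self-referencing canonical -->
        <link rel="canonical" href="http://www.mysite.com/item-12345/">

        <!-- On www.mysite.com/item-12346/ (same product, another unit):
             point the duplicate at the listing chosen as the original -->
        <link rel="canonical" href="http://www.mysite.com/item-12345/">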

    Read the article
