Search Results

  • My virtual server is created, but why is nothing showing when I visit the site?

    - by web designer
    I have created a virtual server for my site as below: I've specified the folder and domain name, created a master zone for the domain, and set NS1 and NS2 for it. Everything seems fine, but when I visit the domain I see a directory listing that doesn't include the test files I've put in the root (www) directory:

        <VirtualHost *>
            DocumentRoot "/home/example.com/www"
            ServerName example.com
            <Directory "/home/example.com/www">
                allow from all
                Options +Indexes
            </Directory>
            ServerAlias www.example.com
        </VirtualHost>

    What am I doing wrong?
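
    One thing worth checking is whether this vhost is answering at all. A minimal sketch, assuming Apache 2.2 syntax: name-based virtual hosts need a matching NameVirtualHost directive, otherwise Apache may serve the default site (whose DocumentRoot holds different files) instead; running apachectl -S lists which vhost answers for each name.

        NameVirtualHost *
        <VirtualHost *>
            ServerName example.com
            ServerAlias www.example.com
            DocumentRoot "/home/example.com/www"
            <Directory "/home/example.com/www">
                Options +Indexes
                Order allow,deny
                Allow from all
            </Directory>
        </VirtualHost>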

  • Should I include everything in the sitemap or only new content?

    - by Mee
    For a website with dynamic content (new content is constantly being added), should I include only the newest content in the sitemap, or should I include everything (with a sitemap index)? What are the best practices for sitemaps, especially for large sites? Also, is there any way to make Google (and other search engines) crawl only the pages in the sitemap? Thanks. Update: also, any idea how Stack Overflow handles this? I'd like to know, but unfortunately (and understandably) they have blocked access to their sitemap.
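
    For reference, a sitemap index is just an XML file pointing at child sitemap files, each of which may hold up to 50,000 URLs; large sites typically keep older content in stable child sitemaps and append new ones as content grows. A minimal sketch (the file names are illustrative):

        <?xml version="1.0" encoding="UTF-8"?>
        <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
          <sitemap>
            <loc>http://www.example.com/sitemap-archive.xml</loc>
            <lastmod>2012-01-01</lastmod>
          </sitemap>
          <sitemap>
            <loc>http://www.example.com/sitemap-new.xml</loc>
            <lastmod>2012-06-15</lastmod>
          </sitemap>
        </sitemapindex>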

  • SQL binary value to PHP variable keeps trailing zeros

    - by Agony
    Using a SQL query to pull data from an MSSQL database yields a value that still carries its zero padding. The data is stored as binary(13), so all 13 bytes come back. The value is text, so the padding bytes generally show up as '?' in a form on the site - and in turn the wrong data gets written back to the database later. So what I need is to select/display only the text itself, not all 13 bytes. Using:

        SELECT CONVERT(char, uilock_pw) AS uipwd FROM tbl_UserAccount

    or

        SELECT uilock_pw FROM tbl_UserAccount

    still keeps the zero padding in the char array. Example in the database: 0x71776531323300000000000000. It shows up as: qwe123??????? but should be: qwe123. I'm not even sure what character those ? represent. Using echo gives a normal qwe123 - but not in a form.
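
    Those ? characters are 0x00 (NUL) bytes: binary(13) pads the value to its fixed length with zeros on the right, and CONVERT(char, ...) keeps them as CHAR(0). A minimal sketch of stripping them on the PHP side (assuming $row comes from the first query above):

        <?php
        // strip the trailing NUL padding from the fixed-length binary column
        $clean = rtrim($row['uipwd'], "\0");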

  • Framework for interaction between web-page and server-side script

    - by Carrier
    I want to make a web page that will have several control elements, among which are check boxes, radio buttons, and "range selectors" (where one can specify the min and max value, as when you select a price range in online markets). The new values shall be sent to the server side as soon as they change (without any Submit buttons etc.), and the server side can return something (one or more numbers, etc.). Does anyone know a good AJAX-style framework that allows such a solution to be built in an easy way, with minimal adaptation/changes? It would be good if the server side of an existing solution were in Perl (not a big deal, but I know it much better than PHP or anything else). The set of controls might change and depend on other parameters, so adding one extra element should not mean rewriting the whole thing. P.S.: I haven't worked in this area for quite a while, so I'm not aware of the existing solutions, and I don't want to reinvent the wheel and write everything from scratch for something that already exists (at least, I hope it does). Thanks in advance!

  • Cannot connect to MySQL database on a cPanel host [closed]

    - by Rafee
    This question was also asked on http://stackoverflow.com/questions/8182119/mysql-database-cannot-connect-with-cpanel

        <?php
        $con1 = mysql_connect("mywebsiteip", "mysql_username", "mysql_user_password");
        if (!$con1) {
            die("Could not connect " . mysql_error());
        } else {
            echo "Good connection";
        }
        mysql_close($con1);
        ?>

    When I run it, it cannot connect to the MySQL database on the cPanel host. I even tried

        $con1 = mysql_connect("mywebsiteip:portnumber", "mysql_username", "mysql_user_password");

    Can anyone let me know which way is right?
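
    A sketch of one thing worth trying (the credentials are placeholders): on most cPanel servers MySQL accepts local connections only, and remote IPs must be whitelisted in cPanel's "Remote MySQL" panel. If the script runs on the same server as the database, connecting to localhost instead of the public IP usually works:

        <?php
        // assumes the script runs on the same cPanel server as MySQL
        $con1 = mysql_connect("localhost", "mysql_username", "mysql_user_password");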

  • Finding 301 redirects

    - by php-b-grader
    I have a URL which is 301 redirecting, but I cannot find where or how it is happening, and I'd like some checks to perform if possible:

    - I've checked .htaccess - it's not there.
    - I've checked Redirects in cPanel - it's not there.
    - In WordPress, I have the Redirection plugin active, and it's not there either.

    Is there anywhere else that could be issuing redirects? I'm at a loss to find out where and how the page is redirecting!
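
    A quick way to see exactly what the server sends, as a sketch (the URL is a placeholder): PHP's get_headers() returns the response headers of each hop, and the Location header often hints at the source of the redirect:

        <?php
        // dump the response headers, following redirects
        print_r(get_headers('http://www.example.com/redirecting-page/', 1));

    Besides .htaccess, cPanel, and plugins, WordPress core itself issues 301s via its canonical-redirect behaviour (www vs non-www, trailing slashes, and the Site Address setting), and redirects can also come from httpd.conf or from header() calls in the theme's PHP.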

  • How can I create a content widget generator?

    - by Richard
    Sites like Buzzfeed offer widgets (JavaScript, PHP, WordPress, etc.) for syndicating their content on other sites. Does anyone have any ideas on how I could go about creating/implementing some kind of interactive widget generator that gives users options for customizing their widget? I assume this would require RSS. Check out the Buzzfeed generator to see what I mean: http://www.buzzfeed.com/network/widget
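
    One common pattern is a script endpoint that renders the feed server-side and writes HTML into the embedding page, with the customization options passed as query parameters. A rough sketch, assuming PHP (the feed URL and the "count" option are illustrative; a real generator would emit the matching script tag for the user to copy):

        <?php
        // widget.php - renders the newest feed items as an embeddable widget
        header('Content-Type: application/javascript');
        $feed  = simplexml_load_file('http://www.example.com/rss');
        $count = isset($_GET['count']) ? (int) $_GET['count'] : 5;
        $html  = '<ul class="my-widget">';
        foreach (array_slice($feed->xpath('//item'), 0, $count) as $item) {
            $html .= '<li><a href="' . htmlspecialchars((string) $item->link) . '">'
                   . htmlspecialchars((string) $item->title) . '</a></li>';
        }
        $html .= '</ul>';
        // document.write lets any site embed the widget with one script tag:
        // <script src="http://www.example.com/widget.php?count=5"></script>
        echo 'document.write(' . json_encode($html) . ');';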

  • Getting started with SOAP [closed]

    - by EmmyS
    A site I developed has a new requirement to get weather data from the National Weather Service. They have quite a bit of info on how to use SOAP to get their data and display it in the browser, but what we need is a cron job that fetches the data at specific intervals and then parses it out into a database. I have no problem writing PHP code that runs an XSLT and parses XML records out into SQL queries, but I have no idea how to handle this with SOAP (which I've never worked with). Do I get the data via a SOAP request, save it to an XML file on my web server, then run the XSLT against that? Or is there some other way to go about this?
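
    That flow works. A minimal sketch using PHP's built-in SoapClient (requires the soap and xsl extensions; the WSDL URL, operation name, and parameters here are placeholders - use the ones from the NWS documentation):

        <?php
        // cron_weather.php - fetch via SOAP, then transform with XSLT
        $client   = new SoapClient('http://www.example.gov/forecast.wsdl');
        $response = $client->GetForecast(array('zipCode' => '10001'));

        // many SOAP weather services return the payload as an XML string;
        // it can be transformed directly, without writing a file first
        $xml = new DOMDocument();
        $xml->loadXML($response);

        $xsl = new DOMDocument();
        $xsl->load('forecast-to-sql.xslt');

        $proc = new XSLTProcessor();
        $proc->importStylesheet($xsl);
        echo $proc->transformToXML($xml); // or hand the output to the DB code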

  • Are these hacking attempts or something less sinister?

    - by Darkcat Studios
    I just had a look through our web server error logs, and Terminal Services is reporting: "Remote session from client name a exceeded the maximum allowed failed logon attempts. The session was forcibly terminated." - hundreds of times, every 10.5 seconds or so, for a period of about 5-10 minutes, once at 2pm yesterday and once again at about 1am this morning. We currently have RDP open to the outside, as I am just completing the setup, and now and then I/others need to jump on from an outside office/location (VPN isn't an option). As these attempts are so regular, am I right in assuming they may be the result of some sort of dictionary attack? Or could something like an internal admin's hung session cause such a mass of events? (Windows Server 2008 R2)

  • Does Google pick up anchor text that is in nested elements?

    - by dangerDAN
    When Google looks at anchor text on a website, will it pick up the text if it is inside nested elements? So, for example, going from:

        <a href="http://www.google.com/">Visit Google</a>

    to:

        <a href="http://www.google.com/">
            <div class="circle">
                <span>Visit Google</span>
            </div>
        </a>

    The reason I ask is that I want to use CSS3 styling for certain links on my website, to style them as circles. But the anchor text needs to be picked up for these links, so I want to know whether the above is bad practice in this case.

  • Posterous instructions for adding Google Analytics do not cover the code snippet to be pasted in the <head>

    - by Kit
    The Posterous instructions for adding Google Analytics end with pasting the Google Analytics domain ID into the settings page. However, the instructions given by Google tell me to paste some JavaScript code into the <head>. How do I deal with this? Do I need to paste the JavaScript code at all? If I do, can I just paste it into the custom HTML/CSS style specification of my Posterous Space?

  • Possible to use an IP derived from dynamic DNS in .htaccess allow/deny commands?

    - by user115745
    On a website I manage, I want to use an .htaccess file to allow access to a certain administrative directory only from my home IP address, which is dynamically assigned by my ISP and therefore changes - not regularly, but it does happen. I also have a DynDNS account, with one of the auto-update clients making sure the hostname always points to my actual home IP address. (I don't actually host anything at home; I have just set up the dynamic DNS account.) Is there any way to combine these features? That is, is it possible to write the .htaccess allow/deny commands at my outside web host so that my home IP address is not hard-coded into the command, but is instead derived from the hostname DynDNS has assigned me, by a real-time lookup every time the directory's .htaccess file is hit? Thank you.
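
    Apache will accept a hostname in an Allow directive, as in this sketch (assuming Apache 2.2 syntax; the hostname is a placeholder) - but with a catch: it matches by reverse DNS of the visiting IP, not by a forward lookup of the listed name, so it only works if the home IP's PTR record resolves back to that name, which a plain DynDNS forward record does not provide:

        # .htaccess in the admin directory
        Order deny,allow
        Deny from all
        Allow from myhome.dyndns.example

    A common workaround is a small cron job on the web host that resolves the DynDNS name (e.g. with PHP's gethostbyname()) and rewrites the Allow line with the current IP address.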

  • What are the search engine effects of registering the same domain on multiple top-level domains (i.e. .com, .ie, .nl, etc.)?

    - by user1020317
    I'm looking to register a few more domains for my company. I have my-company.com at the moment, but now require my-company.com.au, my-company.nl, and some others. I'm running through my options and wondering which is best:

    1. Duplicate all the content of the .com site and make a replica at the other domains.
    2. Buy the other domains but 301 redirect them back to the .com domain.
    3. Create a full new website with different content for each new domain, thus having no text duplication.

    We currently sell all over the world and would like to raise our search rankings in various countries. Can this be done by buying the domain in each country, and if so, how will the above methods affect our search rankings? Any other suggestions are welcome!

  • Getting search result URLs in the millions [closed]

    - by tereško
    I looked at all the Stack Exchange sites and couldn't find one clearly suitable for this question, so I'm posting here as the nearest match to the scenario. After a month of research I have basically given up on getting all the URLs from a set of search results programmatically. I looked at the Google Search API for a way to get millions of search-result URLs into a text file or something similar, with no success, but I am 100% sure there must be a way or trick to do it. The real question: is there any way, programmatic or manual, that I can get 1000+ search result URLs? (For example, the query "Apple" returns millions of results on Google, and I want as many of those result URLs as possible in a text file.)

  • Impact of migrating home page from http to https on search results

    - by 2Stroke SEO
    Guys - I've just had to change one of my site's home pages from http to https. I had plenty of links coming into the http page and was performing well in Google against many of my targeted search phrases. I did a 301 redirect from the http page to transfer the link juice to the https page (and to prevent duplicate content issues), but my search rankings have tanked, which suggests no link juice has been transferred. My PR has vanished - which I'd expect - but I'm really surprised that the SERP rankings fell off the face of the earth. Anyone have any ideas how I can recover from this? I've waited a couple of months since the changes took effect, just in case Google was taking time to check it out.
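
    One thing worth double-checking is that the redirect really is a single-hop 301 (not a 302, and not a chain of redirects). As a sketch, assuming Apache with mod_rewrite and an .htaccess in the site root, a rule for just the home page might look like:

        RewriteEngine On
        RewriteCond %{HTTPS} off
        RewriteRule ^$ https://www.example.com/ [R=301,L]

    Fetching the http URL with a header-inspection tool should show exactly one "301 Moved Permanently" response whose Location header is the https URL.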

  • PHP URL Rewrite engine for small project

    - by Jens Törnell
    I use PHP. I want to set up a micro site as a prototype, where I can work with the frontend only, separated from any CMS. URL rewriting: I also want the URLs to be clean, like http://www.test.com/products/tables/green/little-wood123/. Question(s): is there any free class for URL rewriting? I searched but found none. If that is not the way to go, what framework is nice for this? It should be tiny, easy to use, and support URL rewriting.
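
    Rewriting like this usually takes two small pieces rather than a class: an Apache rule that funnels every request to a single PHP file, and a few lines of PHP that split the path into segments. A minimal sketch (file and variable names are illustrative):

        # .htaccess - send everything that isn't a real file or directory to index.php
        RewriteEngine On
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule ^ index.php [QSA,L]

        <?php
        // index.php - a tiny front controller
        $path     = trim(parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH), '/');
        $segments = ($path === '') ? array() : explode('/', $path);
        // /products/tables/green/little-wood123/ yields
        // array('products', 'tables', 'green', 'little-wood123')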

  • AdWords traffic not (properly) reflected in Analytics

    - by CJM
    I have an AdWords account which was set to use auto-tagging of URLs. When looking at the Analytics account for that site, I couldn't find any reference to AdWords traffic, either in the Advertising section or in the Traffic Sources section. So I manually constructed the URL tags and updated the campaign ad. Once the ad was approved and the clicks started coming through again, I could see the results in the Traffic Sources section of Analytics: in Sources > Campaigns my campaign was listed, and under Sources > All Traffic it was registering the same level of traffic from google / adwords. However, the Advertising > AdWords section is still drawing a blank. Any ideas? Are there explicit steps needed to enable full tracking of AdWords campaigns? If it is relevant, the AdWords campaign was set up with one account and the Analytics tracking with another, but both accounts have full access to both AdWords and Analytics.
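
    For reference, manual tagging means appending the utm_* parameters to each destination URL, along the lines of this sketch (the values are illustrative):

        http://www.example.com/landing/?utm_source=google&utm_medium=cpc&utm_campaign=my_campaign

    Manually tagged visits appear under Traffic Sources, as observed above, but the Advertising > AdWords reports are only populated when auto-tagging is enabled and the AdWords account is linked to the Analytics property from inside AdWords; with two separate accounts involved, that linking step is the first thing to check.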

  • Does Google index iframes on Facebook fan pages? (Whole website content)

    - by user2536417
    I have a fairly simple question that I've tried to get help with from the guys on the Google Webmaster Help Q&A site, but so far no joy, so hopefully someone here can provide the information I'm looking for. I have a Facebook fan page for my website. I have made an app that basically uses an iframe and puts the site within a frame inside Facebook. All works well, but Google is not indexing this page. I am using <link rel="canonical" href="#" /> on my pages, so perhaps this is the issue?
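
    That canonical tag is a likely culprit: "#" is not a valid canonical URL. Each page would normally point at its own full URL on the real site, as in this sketch (the URL is a placeholder), which also tells Google to index the site itself rather than the framed copy:

        <link rel="canonical" href="http://www.example.com/this-page/" />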

  • Search Result Organization

    - by Vecta
    I'm creating an AJAX live search on a website I'm working on. Users will select values from a few dropdowns and a list of products will be returned based on what they select. Some possible fields would be color, model, make, etc. What type of organization of search results do users tend to find most useful? Is it better to lump them all together (alphabetized) or is it more useful to group them by make? In the past I've tended to group them by "make", but I'm now concerned that this continually forces items whose make falls toward the end of the alphabet to the bottom of the list. Any tips are greatly appreciated.

  • Best CMS for review-type sites

    - by Pru
    Is there an ideal CMS for making a review site? By review site, I mean one like a restaurant review site where each entry belongs to different major categories like Cuisine and City, and users can browse and filter by each or by combination (Chinese food in Los Angeles, with suggestions of other Chinese restaurants in LA, etc.). Furthermore, I'd want it to support other fields like price, parking, kid-friendliness, etc., and to let users filter by those criteria.

    I've been told that with a combination of custom taxonomies, plugins, and many clever little queries, WordPress 3.x can handle this. But I'm having a heck of a time with the nitty gritty, and that's where I find the community support is lacking. The sort of thing you'd think would work in WP, like making one parent category for Cuisine and one for City, doesn't really work once you get further in and start trying to pull it all together. Then you find these blog posts where people say, "This example shows that one could create a huge movie review site using custom taxonomies...", but when you go and try it you hit all sorts of challenges and oddities that point a big long finger at WordPress being, in fact, a blogging platform. The best I came up with was one category for the cuisine and one tag for the city, then a couple of custom tag-like taxonomies for the other features. It's quite a mess to try to assemble all of that into a natural, intuitive site.

    I expect a few versions down the road WP will be able to do these sorts of sites out of the box. So I thought I'd take a step back before I run back into the WordPress fray and find out whether there is another platform better suited to this sort of relational content site. Directory scripts in some ways offer many of the features I'm looking for, but I need something more flexible and, hopefully, interactive (comments, reviews). I'm especially looking for feedback from people who've crafted sites like this. Thanks!
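
    For what it's worth, the taxonomy side of WordPress 3.x tends to be less painful when Cuisine and City are registered as proper custom taxonomies on a custom post type, rather than coerced out of categories and tags. A minimal sketch (all names are illustrative):

        <?php
        // in a plugin or the theme's functions.php
        function review_site_register() {
            register_post_type('restaurant', array(
                'label'  => 'Restaurants',
                'public' => true,
            ));
            register_taxonomy('cuisine', 'restaurant', array(
                'label'        => 'Cuisine',
                'hierarchical' => true,   // behaves like categories
            ));
            register_taxonomy('city', 'restaurant', array(
                'label'        => 'City',
                'hierarchical' => true,
            ));
        }
        add_action('init', 'review_site_register');

    Combined filters (Chinese food in Los Angeles) then map onto taxonomy queries, e.g. a tax_query in WP_Query on WordPress 3.1 and later.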

  • SEO and domain name - which form?

    - by user984621
    I just want to register a domain name for my Spanish class and wonder which domain name is better for this purpose: learningspanish.com or ilearnspanish.com. Which one is better? The domain name must be English, but I don't know what works better for Google and SEO - "learn" or "learning". I would be grateful for your feedback, and sorry if the explanation above is hard to follow (I can try to explain it better). Thank you.

  • Facebook contest policy no-no?

    - by Fred
    I would like to post a link on a Facebook page that exits Facebook entirely and goes to a client's website, where people land on a (client's) page where they can enter their e-mail address, with rules and disclosures etc., to be entered in a temporary database file for a draw once the number of entries reaches 100, for instance. Once the number of entries reaches 100, a random winner is picked and notified via e-mail. The functionality is as follows:

    1. A link is placed on a Facebook page leading to an external page.
    2. The page is a form to merely enter an e-mail address for the contest.
    3. The e-mail address is placed in a temporary file.
    4. An automatic e-mail is sent to the address used, for confirmation, using a SHA-256 hash.
    5. The person receives the e-mail, which says something to the effect of "Please confirm your e-mail address etc. If you did not authorize this, simply ignore this message and no further action will be taken."
    6. If the person clicks the confirmation link, the e-mail address is stored in the database and the person is notified again: "Thank you for signing up etc."
    7. Once others do the same and the database reaches a certain number, the form is no longer accessible and a random e-mail address is automatically picked.
    8. Once picked, an e-mail is automatically sent to the winner with the instructions, also notifying me.
    9. Once that person clicks yet another confirmation link, the database is automatically deleted.

    I have built this myself and have no intention of breaking any rules, nor jeopardizing the work/time/energy I have put into this project. Is this allowed?
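
    On the technical side, the confirmation in steps 4-6 is typically a SHA-256 token derived from the address plus a server-side secret, as in this sketch (assuming PHP; the secret and URLs are placeholders):

        <?php
        // build the confirmation link for the entered address
        $secret = 'replace-with-a-long-random-server-side-secret';
        $token  = hash('sha256', $email . $secret);
        $link   = 'http://www.example.com/confirm.php?email=' . urlencode($email)
                . '&token=' . $token;
        // confirm.php recomputes hash('sha256', $_GET['email'] . $secret) and
        // stores the address only when it matches $_GET['token']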

  • SEO/Google: How should I handle multiple countries and domains?

    - by Valorized
    Hello. I'm the webmaster of an online shop based in Austria (Europe), so we registered example.at. We also own other domain names like example-shop.com and example.info. Currently all of those domains are 301-redirected to the .at one. Still available: example.net and example.org (and .ws/.cc); unfortunately not available: .de/.eu. The .com is currently owned by one of our partners; the contract ends in 2012, but until then we have no chance of getting that one.

    Recently I read more about geo-targeting and I noticed one big deal: the TLD .at is hardly recognised in Germany (google.de), whereas it is excellently listed in Austria (google.at). And because of the .at, I cannot set the target location manually (or to unlisted). More info: https://www.google.com/support/webmasters/bin/answer.py?answer=62399&hl=en. This is a big problem. I looked at Google Analytics and - although Germany is 10x as big as Austria - there are more visits from Austria.

    So, how should I configure the domains in order to get the best results in both Germany and Austria? I thought of some solutions:

    1. Stop redirecting the .info, leaving a duplicate of the .at site. In Webmaster Tools I could then set the target location of the .info to Germany; as the .at still targets Austria, both would be targeted. However, I don't know whether Google punishes one of them because of the duplicate content.

    2. Same as 1, but with .net or .org (I think .info is not a "nice" domain, and I think search engines prefer .com, .net or .org to .info).

    3. Same as 1 (or 2), but with a rel="canonical" on the new one pointing to the .at. Con: I don't think this will improve the situation, because it still tells Google that the .at one is more important, as in: "if .info points to .at, the target may still be Austria".

    4. rel="canonical" on the .at pointing to the new one (.info, .net or .org). However, I fear this will have a negative impact on the listing on google.at, because: "hey, the well-known .at is not important anymore, so let's focus on the .info, which is not well-known" - and therefore a bad position in search results.

    5. Redirect the .at to the new one (.info, .net or .org) with a 301 redirect. Con: might be worse than 4; we might lose PageRank (or "the value of the page", since Google says PageRank is not important anymore). Moreover, this might be even more confusing for the customers. With 3 or 4, customers don't get redirected and don't see the canonical meta tag.

    So, dear experts, please tell me what the best option would be! Thank you very much for your advice in advance, and please excuse the long question. I really appreciate this network! Please note: it's exactly the same content AND language - in Austria we speak German.

  • Protecting a webpage with an authentication form

    - by Luke
    I have created an employee webpage with a lot of company info, links, etc., but I want to protect the page because it contains some confidential company information. I am running IIS 7.5 on Windows Server 2008 R2, and I already have the site set up as a normal, non-protected site. I want all Active Directory users to have access to it. This is not an intranet site; it is exposed to the internet. I tried setting it up using Windows Authentication, but I had problems with multiple login prompts, etc. I just want a simple form for users to enter their credentials and get access to the site, and I need it to query AD for the login. I've searched the web for a guide on this, but I can't seem to find one that fits my situation. This is not a web app - it is just a simple HTML site. Does anyone have any suggestions or a link to a guide? Thanks so much! - LB
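
    A rough sketch of the usual approach for this on IIS: ASP.NET forms authentication backed by the ActiveDirectoryMembershipProvider, configured in web.config. All names here are placeholders; Login.aspx would be a simple page with the built-in ASP.NET Login control, and in IIS 7.5's integrated pipeline the rule can be made to cover the static .html files as well:

        <configuration>
          <connectionStrings>
            <add name="ADService"
                 connectionString="LDAP://dc.example.local/DC=example,DC=local" />
          </connectionStrings>
          <system.web>
            <authentication mode="Forms">
              <forms loginUrl="Login.aspx" timeout="30" />
            </authentication>
            <authorization>
              <!-- deny anonymous users; everyone else must log in -->
              <deny users="?" />
            </authorization>
            <membership defaultProvider="ADProvider">
              <providers>
                <add name="ADProvider"
                     type="System.Web.Security.ActiveDirectoryMembershipProvider, System.Web, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a"
                     connectionStringName="ADService"
                     attributeMapUsername="sAMAccountName" />
              </providers>
            </membership>
          </system.web>
        </configuration>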

  • HTTP requests terminating early

    - by spiderplant0
    I noticed that on some of my sites, images were occasionally not being downloaded fully. After a bit of investigation it appears the problem is not restricted to images - .css, .js, etc. also occasionally terminate early. The faults appear to be random. The debugging proxy Fiddler2 reports that fewer bytes were received than requested, and Firebug reports "Image corrupt or truncated". Obviously this is mainly a concern between me and my hosting company, but despite many emails they have not been able to get to the bottom of it. Transferring to another hosting company is an option, but really a last resort. Has anyone seen this kind of thing before, or can anyone suggest what might be causing it, or any Apache setting I could ask them to check? Will Apache log this kind of error? They haven't been able to provide me with any logs, but if I know exactly where such things would be logged, maybe I can prompt them into action.
