Search Results

Search found 9763 results on 391 pages for 'ys pro'.


  • Stop bots from crawling old links with extensions

    - by Jared
    I've recently switched to MVC3, which uses extension-less URLs, but Google and Bing have a wealth of links that they are still crawling which no longer exist. So I'm trying to find out whether there is a way to format robots.txt (or some other method) to tell Google/Bing that any link ending in an extension isn't a valid link... Is this possible? For pages that I'm concerned a user may have saved as a favourite, I'm displaying a 404 page that lists the links to follow once they land on the new page (I decided not to just redirect them, as I don't want to maintain those redirects forever). For Google's and Bing's sake I do have the canonical tag in the header.

        User-agent: *
        Allow: /
        Disallow: /*.*

    EDIT: I just added the third line (the Disallow rule) and it APPEARS to do what I'm wanting: allow a path, but disallow a file. Can anyone confirm this?
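
    A more targeted variant is sketched below; it is only a sketch and assumes the old URLs all used known extensions (the .aspx/.html patterns are placeholders, not taken from the question). Googlebot and Bingbot both honour the * wildcard and the $ end-of-URL anchor, so this blocks only the legacy extensions instead of every URL containing a dot:

        User-agent: *
        Disallow: /*.aspx$
        Disallow: /*.html$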

    Read the article

  • Server Firewall preventing sending of email [migrated]

    - by Jo Fitzgerald
    The firewall on my VPS appears to be preventing my site from sending email. It was working fine until the end of last month. My hosting provider (Webfusion) has been next to useless. I am able to send email if I open INPUT ports 32768-65535, but not if these ports are closed. Why would this be? I have the following rules in my firewall:

        # sudo iptables -L
        Chain INPUT (policy DROP)
        target      prot opt source                destination
        VZ_INPUT    all  --  anywhere              anywhere

        Chain FORWARD (policy DROP)
        target      prot opt source                destination
        VZ_FORWARD  all  --  anywhere              anywhere

        Chain OUTPUT (policy DROP)
        target      prot opt source                destination
        VZ_OUTPUT   all  --  anywhere              anywhere

        Chain VZ_FORWARD (1 references)
        target      prot opt source                destination

        Chain VZ_INPUT (1 references)
        target      prot opt source                destination
        ACCEPT      tcp  --  anywhere              anywhere              tcp dpt:www
        ACCEPT      tcp  --  anywhere              anywhere              tcp dpt:https
        ACCEPT      tcp  --  anywhere              anywhere              tcp dpt:smtp
        ACCEPT      tcp  --  anywhere              anywhere              tcp dpt:ssmtp
        ACCEPT      tcp  --  anywhere              anywhere              tcp dpt:pop3
        ACCEPT      tcp  --  anywhere              anywhere              tcp dpt:domain
        ACCEPT      udp  --  anywhere              anywhere              udp dpt:domain
        ACCEPT      tcp  --  anywhere              anywhere              tcp dpts:32768:65535
        ACCEPT      udp  --  anywhere              anywhere              udp dpts:32768:65535
        ACCEPT      tcp  --  localhost.localdomain localhost.localdomain
        ACCEPT      udp  --  localhost.localdomain localhost.localdomain

        Chain VZ_OUTPUT (1 references)
        target      prot opt source                destination
        ACCEPT      tcp  --  anywhere              anywhere
        ACCEPT      udp  --  anywhere              anywhere

    The VPS is running Plesk 10.4.4 (please ask if you require further technical information to help me).
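
    A minimal sketch of one common fix, assuming the firewall is plain iptables with the state/conntrack match available: accept packets that belong to connections the server itself initiated, so the high ephemeral ports (32768-65535) no longer need to be opened just to receive SMTP replies. The chain name VZ_INPUT is taken from the listing above.

        # accept replies to outbound connections (e.g. SMTP to remote mail servers)
        sudo iptables -I VZ_INPUT 1 -m state --state ESTABLISHED,RELATED -j ACCEPT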

    Read the article

  • Transfer .com domain to GoDaddy - websites running on same domain - 3 weeks left until expiration, 2 days left web hosting

    - by Eric Nguyen
    Our company purchased the abc.com domain from a local registrar. The domain will expire in about 3 weeks. Our main websites run on this abc.com domain and they cannot be down for long. The web hosting service will end in 2 days. Our websites are already hosted, up and running, on Amazon EC2. We would like to transfer the domain to GoDaddy now or as soon as possible (since we have many other domains there, and we believe GoDaddy will be better long-term considering the prices and features it offers). There are several questions around the decision to transfer the domain to GoDaddy:
    1) What cost and time are required to move away from our local registrar? This is currently unknown, as I'm still trying to retrieve the agreement we have with them.
    2) How do the 3 weeks left until the domain expires matter here? Should we wait until the domain expires and then purchase it through GoDaddy? How long would such a process take, as I suppose our websites would be down during that time? Any other drawbacks?
    3) What can I do to ensure our websites keep functioning regardless of the domain transfer process? It seems the actual registrar here is enom.com and the local registrar just partners with it, so I suppose I should park the abc.com domain with enom.com and change the DNS settings so that our websites can continue to be hosted on EC2 as normal. How long does it normally take for a domain to be transferred to GoDaddy completely? Is it even possible to keep our websites up and running during the whole domain transfer process?
    Apologies for throwing so many questions at once. It's rather last minute, and I suddenly realised there are too many unknown risks.

    Read the article

  • % new visitor vs. % returning visitor

    - by Torben Gundtofte-Bruun
    I'm not sure how to interpret the results in Google Analytics. I understand that some metrics should be high and some should be low. But this one I don't get: % new visitors vs. % returning visitors. It's good that users are returning, but surely it's also good to get new, fresh visitors. How do I evaluate this %-vs-% ratio?
    The higher the better: visits, unique visitors, pageviews, pages per visit, avg. visit duration.
    The lower the better: bounce rate, drop-offs.

    Read the article

  • Google Analytics checkout page tracking problem

    - by Amir E. Habib
    I am running a multilingual website, each language on a different domain name. I am trying to lead all purchase requests to the checkout process, which has its own domain too. In order to keep Google Analytics tracking working, I've updated the Google Analytics code accordingly and set the source domain to 'multiple top-level domains'. Everything is going fine so far, except that in the E-commerce Overview the "Sources / Medium" always shows as (direct) - or as the name of the source domain. Since I am redirecting using PHP header('Location: ...'), the Google _link method doesn't seem to be working properly. I want to focus on two questions: Should I create a new profile for the checkout domain in Google Analytics? (I am now using the profile ID of the source domain even though I move to the checkout domain - is that OK?) When I pass the cookies of the source domain to the checkout domain, I notice that the Google cookies are copied to the new domain (the cookie path is .checkout-domain/) and they have the same values as the original cookies - but for some reason another set of cookies, with different values (same path), is created once I access a page with Google Analytics code on the checkout pages. It feels like I'm doing something wrong here, so my question is: what am I doing wrong? Does anyone have an idea how to pass the cookies to the checkout domain correctly?
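
    For reference, a minimal sketch of the classic (ga.js) cross-domain setup; the property ID and the checkout hostname are placeholders, not taken from the question. Both the source and the checkout domains would carry this configuration, and the hand-off itself has to go through the _link (or _linkByPost) method so the campaign cookies travel in the URL rather than being copied server-side:

        <script type="text/javascript">
          var _gaq = _gaq || [];
          _gaq.push(['_setAccount', 'UA-XXXXXXX-X']);  // placeholder property ID
          _gaq.push(['_setDomainName', 'none']);       // distinct top-level domains
          _gaq.push(['_setAllowLinker', true]);
          _gaq.push(['_trackPageview']);
        </script>
        <!-- links leaving for the checkout domain carry the cookies in the URL -->
        <a href="https://checkout.example/cart"
           onclick="_gaq.push(['_link', this.href]); return false;">Checkout</a>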

    Read the article

  • .htaccess rewrite www to non-www and remove .html

    - by lester8891
    I need rewrite rules to redirect the following:
        http://www. to http://
        /file.html to /file
    I've tried using a combination of these, but each time it results in a redirect loop in one of the situations:
        RewriteBase /
        RewriteRule ^(.*)\.html$ $1 [NC]
        RewriteCond %{HTTP_HOST} ^www.domain.co.uk [NC, L]
        RewriteRule ^(.*)$ http://domain.co.uk/$1 [R=301]
    I figure it's probably something to do with the flags, but I don't know how to fix it. Just to be clear, it needs to handle all of these situations:
        http://www.domain.co.uk to http://domain.co.uk
        http://www.domain.co.uk/file.html to http://domain.co.uk/file
        http://domain.co.uk to http://domain.co.uk
        http://domain.co.uk/file.html to http://domain.co.uk/file
    Thanks!
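
    A minimal sketch of one way to do both redirects without a loop, assuming Apache mod_rewrite and that the content actually lives in .html files (domain.co.uk is kept as the placeholder from the question). The THE_REQUEST condition only matches the URL the browser originally asked for, so the internal rewrite back to the .html file does not trigger another external redirect:

        RewriteEngine On

        # 1. www -> non-www
        RewriteCond %{HTTP_HOST} ^www\.domain\.co\.uk$ [NC]
        RewriteRule ^(.*)$ http://domain.co.uk/$1 [R=301,L]

        # 2. externally redirect /file.html -> /file (only for direct requests)
        RewriteCond %{THE_REQUEST} \s/+(.+?)\.html[\s?] [NC]
        RewriteRule ^ /%1 [R=301,L]

        # 3. internally map /file back to file.html so the page still serves
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteCond %{REQUEST_FILENAME}\.html -f
        RewriteRule ^(.*?)/?$ $1.html [L]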

    Read the article

  • XAMPP - Unable to serve files larger than ~30MB [on hold]

    - by Sparx401
    I'm developing a site locally with XAMPP on Windows 7, and as far as media is concerned, I'm unable to play media files that are larger than 30MB or so. Both video and audio files (MP4 and MP3 respectively) generate this error in Chrome (and show similar errors in other browsers such as IE9 and Opera):
        No data received
        Unable to load the webpage because the server sent no data.
        Error 324 (net::ERR_EMPTY_RESPONSE): The server closed the connection without sending any data.
    It seems that the exact number of MB varies somewhat between browsers, though. One video in question is 34MB and actually plays in Opera and IE9, but gives the aforementioned error in Chrome. I've checked that the file paths are typed correctly and ensured that the directive to serve MP4s is in .htaccess:
        AddType video/mp4 mp4
    I also have these directives set in the same .htaccess file:
        php_value upload_max_filesize "80M"
        php_value post_max_size "80M"
        php_value max_input_time 60
        php_value max_execution_time 60
    And memory_limit is set in php.ini to "128M", so I'm left wondering: what is causing my files not to play, and which directives, if any, do I have to change on the server side? Perhaps something to do with limitations of the GET method (the method I'm seeing in Chrome's network tab among other header request/response info)?

    Read the article

  • Does sitewide html refactoring affect Google traffic?

    - by Name
    Good morning. I recently made a big structural change to my site, and the very next day the number of Google impressions went from 75,000 to 3,000, with a proportional drop in traffic from searches. No URLs were changed, nor were the page titles or descriptions. Everything is exactly the same, just different looking, except that the site now barely appears on Google at all. Does anybody have a clue why?

    Read the article

  • How to find the Fastest DNS servers to host our domain?

    - by Denis Volovik
    The question was born because lately we've seen a pretty odd error message (well, at least for us, for the first time) in Google Webmaster Tools: "DNS lookup timeout"... I was pretty sure that with eNom's 5 DNS servers (dns1... to dns5.name-services.com) we were well set... But it appears that from Europe/Hungary, for example, dns1.name-services.com takes 170 ms to respond to a ping, while GoDaddy's ns75.domaincontrol.com takes only 40 ms, and at the same time dns2 to dns5.name-services.com each result in a timeout error (on ping)... This issue came to our attention right in the final stages of optimizing our website (almost to death) - basically, just in time... I would love to move our domains to a fast (fastest?) and reliable DNS server - but how do I find one? Also, I did the ping tests from various geographic locations (we have servers in many countries) and GoDaddy seemed to be faster than eNom in almost every case. I'd be very thankful for any hints on this! Edited: Well, maybe this one does not have an answer after all...
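
    A small sketch for comparing nameservers by actual DNS query latency rather than ICMP ping (many DNS hosts rate-limit or deprioritise ICMP, so ping can mislead). It assumes the standard dig tool is available; query a domain actually delegated to each provider and read the "Query time" line in the statistics:

        dig @dns1.name-services.com yourdomain.com A +noall +stats
        dig @ns75.domaincontrol.com yourdomain.com A +noall +stats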

    Read the article

  • Submitting a sitemap to take care of inherited Google crawler errors

    - by leeand00
    I have an awful lot of Google Crawler errors (1000 or so) after I inherited a site that the previous owner migrated without moving much of their content. Would generating a map of the current site and submitting it to Google help fix this? Is there any quicker, automated way to eliminate errors other than clicking each and every site error? Note: I have already tried automating this on my own.
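
    For reference, a minimal sitemap file is sketched below (URLs and date are placeholders). Submitting one in Webmaster Tools tells Google which URLs do exist; the inherited 404 errors still have to age out or be removed, but a sitemap speeds up discovery of the current structure:

        <?xml version="1.0" encoding="UTF-8"?>
        <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
          <url>
            <loc>http://example.com/</loc>
            <lastmod>2012-10-01</lastmod>
          </url>
          <url>
            <loc>http://example.com/about/</loc>
          </url>
        </urlset>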

    Read the article

  • Windows 7 - traceroute hop with high latency! [closed]

    - by Mac
    I've been experiencing this problem for quite a while, and it's quite frustrating. I'll do a traceroute to www.l.google.com, for example. This is the result (please note: I have replaced some parts of personal information with text - i.e. ISP.IP is in reality an actual IP address, and ISPNAME replaces the actual ISP name):

        Tracing route to www.l.google.com [173.194.34.212] over a maximum of 30 hops:

          1     1 ms     1 ms    <1 ms  192.168.1.1
          2     9 ms     8 ms    10 ms  ISP.EXCHANGE.NAME [ISP.IP.172.205]
          3   161 ms   171 ms   177 ms  host-ISP.IP.215.246.ISPNAME.net [ISP.IP.215.246]
          4    12 ms     9 ms    10 ms  host-ISP.IP.215.246.ISPNAME.net [ISP.IP.215.246]
          5    10 ms     9 ms    17 ms  host-ISP.IP.224.165.ISPNAME.net [ISP.IP.224.165]
          6    10 ms     9 ms    10 ms  10.42.0.3
          7     9 ms     9 ms    10 ms  host-ISP.IP.202.129.ISPNAME.net [ISP.IP.202.129]
          8    10 ms     9 ms     9 ms  host-ISP.IP.209.33.ISPNAME.net [ISP.IP.209.33]
          9    77 ms   129 ms   164 ms  host-ISP.IP.198.162.ISPNAME.net [ISP.IP.198.162]
         10    43 ms    42 ms    43 ms  72.14.212.13
         11    42 ms    42 ms    42 ms  209.85.252.36
         12    59 ms    59 ms    59 ms  209.85.241.210
         13    60 ms    76 ms    68 ms  72.14.237.124
         14    59 ms    59 ms    58 ms  mad01s08-in-f20.1e100.net [173.194.34.212]

        Trace complete.

    Notice that there is a spike on the 3rd hop, but also notice that the 3rd and 4th hops are to the exact same destination. Furthermore, when I ping the offending hop separately, I get the low latency I would expect to that server:

        Pinging ISP.IP.215.246 with 32 bytes of data:
        Reply from ISP.IP.215.246: bytes=32 time=10ms TTL=253
        Reply from ISP.IP.215.246: bytes=32 time=9ms TTL=253
        Reply from ISP.IP.215.246: bytes=32 time=12ms TTL=253
        Reply from ISP.IP.215.246: bytes=32 time=9ms TTL=253
        Reply from ISP.IP.215.246: bytes=32 time=10ms TTL=253
        Reply from ISP.IP.215.246: bytes=32 time=9ms TTL=253
        Reply from ISP.IP.215.246: bytes=32 time=10ms TTL=253
        Reply from ISP.IP.215.246: bytes=32 time=9ms TTL=253
        Reply from ISP.IP.215.246: bytes=32 time=10ms TTL=253
        Reply from ISP.IP.215.246: bytes=32 time=10ms TTL=253

        Ping statistics for ISP.IP.215.246:
            Packets: Sent = 10, Received = 10, Lost = 0 (0% loss),
        Approximate round trip times in milli-seconds:
            Minimum = 9ms, Maximum = 12ms, Average = 9ms

    I'm baffled as to why or how this is happening, and it seems to "fix itself" at random times. Here is an example of where it was working as expected: http://i.imgur.com/bysno.png - notice how many fewer hops were taken. Please note that all the posted results occurred within 10 minutes of testing. I've tried contacting my ISP, and they seem clueless; in their eyes, as long as "the download speed is not slow", they're doing everything right. Any insight would be very much appreciated, and thanks in advance!
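
    As an aside, a high time on a middle hop while later hops stay low is often just that router deprioritising ICMP replies addressed to itself, not real path latency. A longer per-hop sample makes that easier to spot; a sketch with mtr is below (on Windows 7 the equivalent GUI tool is WinMTR, so treat this as illustrative only):

        mtr --report --report-cycles 100 www.l.google.com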

    Read the article

  • How do you exclude yourself from Google Analytics on your website using cookies?

    - by Keoki Zee
    I'm trying to set up an exclusion filter with a browser cookie, so that my own visits to my site don't show up in my Google Analytics. I tried 3 different methods and none of them have worked so far. I would like help understanding what I am doing wrong and how I can fix this.

    Method 1: First, I tried following Google's instructions for excluding traffic by cookie content (http://www.google.com/support/analytics/bin/answer.py?hl=en&answer=55481): create a new page on your domain containing the following code:

        <body onLoad="javascript:pageTracker._setVar('test_value');">

    Method 2: Next, when that didn't work, I googled around and found this Google thread (http://www.google.com/support/forum/p/Google%20Analytics/thread?tid=4741f1499823fcd5&hl=en), where the most popular answer says to use slightly different code. SHS Analytics wrote:

        <body onLoad="javascript:_gaq.push(['_setVar','test_value']);">

    "Thank you! This has now set a __utmv cookie containing 'test_value', whereas the original pageTracker._setVar('test_value') (which Google is still recommending) did not manage to do that for me (in Mac Safari 5 and Firefox 3.6.8)." So I tried this code, but it didn't work for me.

    Method 3: Finally, I searched Stack Overflow and came across this thread (http://stackoverflow.com/questions/3495270/exclude-my-traffic-from-google-analytics-using-cookie-with-subdomain), which suggests that the following code might work:

        <script type="text/javascript">
          var _gaq = _gaq || [];
          _gaq.push(['_setVar', 'exclude_me']);
          _gaq.push(['_setAccount', 'UA-xxxxxxxx-x']);
          _gaq.push(['_trackPageview']);
          // etc...
        </script>

    This script appeared in the head element in the example, instead of in the onload event of the body like in the previous two examples. I tried this too, but still had no luck with excluding myself from Google Analytics.

    To re-iterate the question: I tried all 3 methods above with no success. Am I doing something wrong? How can I exclude myself from my Google Analytics using an exclusion cookie for my browser?
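
    One detail worth noting, sketched below with placeholder values: setting the custom variable only stamps the visitor's __utmv cookie; the visits are only removed if the Analytics profile also has a Custom Filter (Exclude, field "User Defined") whose pattern matches the same value, and the _setVar push has to run before _trackPageview.

        <script type="text/javascript">
          var _gaq = _gaq || [];
          _gaq.push(['_setAccount', 'UA-XXXXXXX-X']);  // placeholder property ID
          _gaq.push(['_setVar', 'exclude_me']);        // must come before _trackPageview
          _gaq.push(['_trackPageview']);
          // In GA admin: Filter Type = Custom filter > Exclude,
          // Filter Field = User Defined, Filter Pattern = exclude_me
        </script>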

    Read the article

  • rich snippets ignored by google [closed]

    - by Thoir Fáidh
    Possible Duplicate: Why would Google Rich Snippets work for one site author but not another?
    I'm facing a problem here. I added rich snippets (microdata) to the website, but Google ignores all of them; the testing tool doesn't detect any errors either. I've read that Google ignores microdata in hidden fields. Unfortunately this is partially the case, since I use jQuery to interact with the content, but the markup is not hidden everywhere, and I believe Google should recognize at least the microdata that is permanently visible to the user. Am I missing something here? It is now about 3 weeks since I updated the website with rich snippets.
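
    For illustration, a minimal sketch of microdata that stays in the initially rendered, visible HTML (the Product type and values are placeholders, not taken from the site in question); markup that only appears after a jQuery interaction, or sits inside display:none containers, is the part Google tends to ignore:

        <div itemscope itemtype="http://schema.org/Product">
          <span itemprop="name">Example product</span>
          <span itemprop="brand">Example brand</span>
          <div itemprop="aggregateRating" itemscope itemtype="http://schema.org/AggregateRating">
            <span itemprop="ratingValue">4.5</span> / 5 from
            <span itemprop="reviewCount">11</span> reviews
          </div>
        </div>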

    Read the article

  • Removing Duplicate Data From SQL Query Output For Display On A Web Page [migrated]

    - by doubleJ
    I had asked a similar question on Stack Overflow but didn't really get anywhere. This page shows the output that I'm currently getting from my MSSQL server. I have a table of venue information (name, address, etc.) that our events happen at. Separately, I have a table of the actual events that are scheduled (an event may happen multiple times in one day and/or over multiple days). I join those tables with this query:

        <?php
        try {
            $dbh = new PDO("sqlsrv:Server=localhost;Database=Sermons", "", "");
            $dbh->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
            $sql = "SELECT TOP (100) PERCENT
                        dbo.TblSermon.Day, dbo.TblSermon.Date, dbo.TblSermon.Time,
                        dbo.TblSermon.Speaker, dbo.TblSermon.Series, dbo.TblSermon.Sarasota,
                        dbo.TblSermon.NonFlc, dbo.TblJoinSermonLocation.MeetingName,
                        dbo.TblLocation.Location, dbo.TblLocation.Pastors, dbo.TblLocation.Address,
                        dbo.TblLocation.City, dbo.TblLocation.State, dbo.TblLocation.Zip,
                        dbo.TblLocation.Country, dbo.TblLocation.Phone, dbo.TblLocation.Email,
                        dbo.TblLocation.WebAddress
                    FROM dbo.TblLocation
                    RIGHT OUTER JOIN dbo.TblJoinSermonLocation
                        ON dbo.TblLocation.ID = dbo.TblJoinSermonLocation.Location
                    RIGHT OUTER JOIN dbo.TblSermon
                        ON dbo.TblJoinSermonLocation.Sermon = dbo.TblSermon.ID
                    WHERE (dbo.TblSermon.Date >= { fn NOW() })
                    ORDER BY dbo.TblSermon.Date, dbo.TblSermon.Time";
            $stmt = $dbh->prepare($sql);
            $stmt->execute();
            $stmt->setFetchMode(PDO::FETCH_ASSOC);
            foreach ($stmt as $row) {
                echo "<pre>";
                print_r($row);
                echo "</pre>";
            }
            unset($row);
            $dbh = null;
        } catch (PDOException $e) {
            echo $e->getMessage();
        }
        ?>

    So, as it loops through the query results, it creates an array for each record and ends up like this:

        Array ( [Day] => Tuesday [Date] => 2012-10-30 00:00:00.000 [Time] => 07:00 PM [Speaker] => Keith Moore [Location] => The Ark Church [Pastors] => Alan & Joy Clayton [Address] => 450 Humble Tank Rd. [City] => Conroe [State] => TX [Zip] => 77305.0 [Phone] => (936) 756-1988 [Email] => [email protected] [WebAddress] => http://www.thearkchurch.org )

        Array ( [Day] => Wednesday [Date] => 2012-10-31 00:00:00.000 [Time] => 07:00 PM [Speaker] => Keith Moore [Location] => The Ark Church [Pastors] => Alan & Joy Clayton [Address] => 450 Humble Tank Rd. [City] => Conroe [State] => TX [Zip] => 77305.0 [Phone] => (936) 756-1988 [Email] => [email protected] [WebAddress] => http://www.thearkchurch.org )

        Array ( [Day] => Tuesday [Date] => 2012-11-06 00:00:00.000 [Time] => 07:00 PM [Speaker] => Keith Moore [Location] => Fellowship Of Faith Christian Center [Pastors] => Michael & Joan Kalstrup [Address] => 18999 Hwy. 59 [City] => Oakland [State] => IA [Zip] => 51560.0 [Phone] => (712) 482-3455 [Email] => [email protected] [WebAddress] => http://www.fellowshipoffaith.cc )

        Array ( [Day] => Wednesday [Date] => 2012-11-14 00:00:00.000 [Time] => 07:00 PM [Speaker] => Keith Moore [Location] => Faith Family Church [Pastors] => Michael & Barbara Cameneti [Address] => 8200 Freedom Ave NW [City] => Canton [State] => OH [Zip] => 44720.0 [Phone] => (330) 492-0925 [Email] => [WebAddress] => http://www.myfaithfamily.com )

    As you can see, The Ark Church and its associated contact information is duplicated, so when I work with those arrays and output them to the page, I see a bunch of duplicate content. I'd like to remove the duplicate information so that I get results similar to this:

        The Ark Church
        Alan & Joy Clayton
        450 Humble Tank Rd.
        Conroe, TX 77305
        (936) 756-1988
        [email protected]
        http://www.thearkchurch.org
        Meetings:
            Tuesday, 2012-10-30 07:00 PM
            Wednesday, 2012-10-31 07:00 PM

        Fellowship Of Faith Christian Center
        Michael & Joan Kalstrup
        18999 Hwy. 59
        Oakland, IA 51560
        (712) 482-3455
        [email protected]
        http://www.fellowshipoffaith.cc
        Meetings:
            Tuesday, 2012-11-06 07:00 PM

        Faith Family Church
        Michael & Barbara Cameneti
        8200 Freedom Ave NW
        Canton, OH 44720
        (330) 492-0925
        http://www.myfaithfamily.com
        Meetings:
            Wednesday, 2012-11-14 07:00 PM

    It doesn't necessarily have to end up like that (I'm not looking for code specific to these results, but a concept of how not to show the duplicated information). I'm assuming that an additional foreach or while will do it, but I haven't figured out any logic beyond something like <?php if ($location == $previouslocation) echo ""; ?>.
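
    One common pattern, shown as a rough sketch (the variable names are illustrative and the HTML output is deliberately minimal): index the rows by venue while looping, so the contact details are stored once and the meeting times accumulate under them, then render each group. It assumes $stmt is the executed statement from the code above.

        <?php
        $venues = array();
        foreach ($stmt as $row) {
            $key = $row['Location'];                    // or a numeric venue ID if one is selected
            if (!isset($venues[$key])) {
                $venues[$key] = array('info' => $row, 'meetings' => array());
            }
            $venues[$key]['meetings'][] = $row['Day'] . ', ' . $row['Date'] . ' ' . $row['Time'];
        }
        foreach ($venues as $venue) {
            $info = $venue['info'];
            echo '<h3>' . htmlspecialchars($info['Location']) . '</h3>';
            echo '<p>' . htmlspecialchars($info['Pastors']) . '<br>'
               . htmlspecialchars($info['Address']) . '<br>'
               . htmlspecialchars($info['City'] . ', ' . $info['State']) . '</p>';
            echo '<ul>';
            foreach ($venue['meetings'] as $meeting) {
                echo '<li>' . htmlspecialchars($meeting) . '</li>';
            }
            echo '</ul>';
        }
        ?>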

    Read the article

  • Send and Receive SMS from my Website [on hold]

    - by Weng Fai Wong
    The way I plan to use it is: have people send an SMS to a number to vote. On the backend (assuming the data comes back to my web server), I will display the voting results on my website. After, say, 10 minutes, I would like to press a button on my website so that ONE of the people who sent an SMS earlier receives an SMS saying that person is a winner. I'm an ASP.NET developer, so I just need an API to code against. One such company I saw that does this (but is limited to the US) is http://www.twilio.com/sms/. Do you know any international providers that are similar to Twilio SMS? I'm based in Sydney, Australia. I've looked through the discussion here but could not find any international provider that does what Twilio SMS does: How to add SMS text messaging functionality to my website? Thank you.

    Read the article

  • Wordpress html email plugin that doesn't merge with your users?

    - by christian
    I just launched a WordPress site as a redesign of my strictly HTML site and moved my Blogger blog to WordPress to keep it all in one place. I am trying to send an HTML email from my WordPress site to all of my Blogger subscribers, encouraging them to subscribe to my new blog and join my newly created membership program. But every HTML email plugin I have found merges the imported addresses with your WordPress users before you can send them an email. Is there a plugin that allows me to send an HTML email to a list of email addresses without adding them as users? Otherwise I will have to import the list, send the email, and then go through and delete all of those users, trying to pick out my legitimate users who have already signed up, all while watching for new sign-ups to make sure I don't delete them.

    Read the article

  • Web Email Configuration

    - by user1378680
    I just created email accounts for my website from cPanel. I then gave each owner of the newly created mailboxes a link to the cPanel webmail. When they tried to log in, it returned "invalid username and password combination", but on my own end they all work very well. What could be the problem? This is my first time doing email configuration, and using cPanel in general. I will be happy to provide any information that you might need. Thank you.

    Read the article

  • Host And Expose Application to local small network

    - by tartak
    I developed a little application (web application) using Java EE + MySQL. I use it to keep some data and, from time to time, to get some reports from that data. My problem is that I have to access this application from 4-5 computers in the office. They are connected through a switch - a typical small office network, nothing fancy. I need some advice on how to do this. I mean, for a small application with no external communication, is it mandatory to use an Apache machine? I'd use a simple Tomcat container on the "server machine" (which is my computer, a Windows machine) and basically I would like to allow my colleagues access as well. I don't have any knowledge about concurrency (I know MySQL permits concurrent access), so I would like some configuration tips also.
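
    A minimal sketch of the relevant piece of Tomcat's conf/server.xml, keeping the default port as an assumption: a standalone Tomcat is generally enough here (no Apache httpd needed), since the stock HTTP connector already listens on all interfaces. Colleagues would then browse to http://<your-machine-ip>:8080/yourapp/ once Windows Firewall allows inbound TCP 8080.

        <!-- conf/server.xml (excerpt): the default connector binds to all interfaces -->
        <Connector port="8080" protocol="HTTP/1.1"
                   connectionTimeout="20000"
                   URIEncoding="UTF-8" />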

    Read the article

  • Informing Googlebot for deprecated pages

    - by trante
    I publish timetables on my website. For example, last year I published a "Number 2 bus Summer 2013 timetable" page. It has a pretty good ranking on Google SERPs for number 2 bus timetable. But this year I added a new page with the name "Number 2 bus Summer 2014 timetable". When users search for number 2 bus timetable in Google, they find the 2013 timetable on the first page of the SERPs, but I want them to find the 2014 timetable. They can reach the 2014 page with the keywords number 2 bus timetable 2014, but most users don't include the year. So what's the proper way to tell Googlebot that the 2013 page is deprecated and that the newer version is the 2014 page? I created a link from the 2013 page to the 2014 page and added a deprecation alert for visitors, but I still see the 2013 timetable on the first page of Google SERPs. Of course it is possible to 301 redirect the 2013 page to the 2014 page, but I want users to still be able to reach the old pages to compare the differences between years. (As you would guess, I have many pages like this.) Edit: Why don't I put the timetables on the same page and show different years' timetables with sorting? Because my old pages have good PageRank scores and SERP positions; removing those old pages would throw that away.

    Read the article

  • Handling SEO for Infinite pages that cause external slow API calls

    - by Noam
    I have an 'infinite' number of pages on my site which rely on an external API. Generating each page takes time (about 1 minute). Links on the site point to such pages, and when a user clicks one, the page is generated while he waits. Considering I cannot pre-create them all, I am trying to figure out the best SEO approach to handle these pages. Options:
    1. Create really simple pages for the web spiders, so that only real users trigger the data fetch and full page generation. I'm a little 'afraid' Google will see this as low-quality content, which might also feel duplicated.
    2. Put them under a directory on my site (e.g. /non-generated/) and disallow that directory in robots.txt. The problem here is that I don't want users to have to deal with a different URL when they want to share this page or make sense of it. I thought about maybe redirecting real users from this URL back to the regular hierarchy and that way 'fooling' Google into not reaching them; again, I'm not sure it will like me for that.
    3. Let Google crawl these pages. The main problem is that I can't control the rate of the API calls, and my site also seems slower than it should from a spider's perspective (if it only crawled the already-generated pages, it would think the site is much faster).
    Which approach would you suggest?
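
    For option 2, the robots.txt stanza would be as small as the sketch below (the directory name is taken from the question); note that Disallow keeps compliant crawlers out of those URLs but does not remove pages that are already indexed:

        User-agent: *
        Disallow: /non-generated/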

    Read the article

  • Is this a DNS or server-side error?

    - by joshlfisher
    I am having difficulty accessing a specific website (I get 500 server fault errors). I can access the site on my iPhone when NOT connected to Wi-Fi. I CANNOT access the site when connected to Wi-Fi or via an Ethernet connection to my home network. I thought it might be a DNS issue, so I copied the DNS servers from a friend who has a different ISP and has no problem accessing the site. No luck. I also tried some of the public DNS servers out there, again with no luck. Does anyone have any idea how to trace this issue?
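
    A short diagnostic sketch (the hostname is a placeholder): compare what the home network resolves against a public resolver, then request the page headers directly. If both networks resolve to the same IP yet only one connection gets the 500, the cause is more likely server-side handling of the requesting IP than DNS.

        nslookup www.problem-site.example
        nslookup www.problem-site.example 8.8.8.8
        curl -I http://www.problem-site.example/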

    Read the article

  • Error using SoapClient() in PHP [migrated]

    - by Dhaval
    I'm trying to access a WSDL (Web Service Definition Language) file using PHP's SoapClient(). I found that the WSDL file is authenticated, so I tried passing credentials in an array as a second parameter, and enabled SSL on my server, but I'm still getting an error. Here is the code I'm using:

        $client = new SoapClient("https://webservices.chargepointportal.net:8081/coulomb_api_1.1.wsdl",
                                 array("trace" => "1", "Username" => "username", "Password" => "password"));

    Here is the error I'm getting:

        Warning: SoapClient::SoapClient(https://webservices.chargepointportal.net:8081/coulomb_api_1.1.wsdl)
        [soapclient.soapclient]: failed to open stream: Connection timed out in PATH_TO_FILE on line 80

        Warning: SoapClient::SoapClient() [soapclient.soapclient]: I/O warning : failed to load external entity
        "https://webservices.chargepointportal.net:8081/coulomb_api_1.1.wsdl" in PATH_TO_FILE on line 80

        Fatal error: Uncaught SoapFault exception: [WSDL] SOAP-ERROR: Parsing WSDL: Couldn't load from
        'https://webservices.chargepointportal.net:8081/coulomb_api_1.1.wsdl' : failed to load external entity
        "https://webservices.chargepointportal.net:8081/coulomb_api_1.1.wsdl" in PATH_TO_FILE:80
        Stack trace: #0 /home2/wingstec/public_html/widget/API/index.php(80):
        SoapClient->SoapClient('https://webserv...', Array) #1 {main} thrown in PATH_TO_FILE on line 80

    The error seems to say the file doesn't exist at the given path, but when we open that URL directly in a browser we do get the file. Can anyone help me figure out what exactly the problem is?
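
    For reference, a hedged sketch of the option names SoapClient actually understands for HTTP authentication ('login' and 'password' rather than 'Username'/'Password'; the credentials below are placeholders). That said, a "Connection timed out" on the stream usually means port 8081 is unreachable from the web host (an outbound firewall or port restriction), which no option array will fix:

        <?php
        $client = new SoapClient(
            "https://webservices.chargepointportal.net:8081/coulomb_api_1.1.wsdl",
            array(
                'trace'              => 1,
                'login'              => 'username',   // placeholder credentials
                'password'           => 'password',
                'connection_timeout' => 15,
            )
        );
        ?>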

    Read the article

  • parameters in a seo url

    - by Marius
    This should be a very simple question for SEO experts. Let's say we have the following URLs:
        http://www.test.com/some-sort-of-page
        http://www.test.com/some-sort-of-page?pgid1189
        http://www.test.com/some-sort-of-page/page/1189
        http://www.test.com/page/1189/some-sort-of-page
    The first one is the ideal solution. What I need to do is somehow pass a resource identifier in the URL, to know exactly what the URL is pointing to, since it can point to a lot of different things. In the second URL, "pgid" specifies that the resource is a "page". URLs 3 and 4 specify the same thing differently. I do not care whether the URL is friendly to people, because, let's face it, 99.9% of people will never bother to remember such a URL no matter how "friendly" it is. So the question is: which of the last 3 URLs would be the best solution for search engines? My guess is the 2nd, with the query string, but I might be wrong. Thanks for your thoughts. P.S. Please don't suggest using the first URL. There's no problem using it, but the question is not about that.

    Read the article

  • How to protect Google Ads from yontoo layers runtime?

    - by Dharmavir
    For some time I have observed that Google ads on any website, including my blog (http://blogs.digitss.com), get replaced with something similar to the uploaded image below. I am sure it's happening to many people, and it could reduce Google AdSense income. After some research I found that it is because of the "Yontoo Layers" runtime from http://www.yontoo.com/ (tagline: "Platform that allows you to control the websites you visit everyday.") - but actually they are taking over. I am not sure which software bundles it onto users' computers, but it seems very bad for the freedom of the Internet and for the advertising/marketing industry. I don't remember ever saying "yes" to installing Yontoo on my computer, yet this piece of software successfully installed itself on my laptop, desktop and office workstation. I am going to disable it now, but the question is: how do I make my websites aware of the Yontoo runtime and stop it from replacing Google ads? So far it has not replaced all AdSense ads - it has only replaced the first AdSense ad instance - but I am sure it will hit more in the future. There could be two approaches: 1) fool the Yontoo runtime by putting some misleading divs in the HTML document to protect the actual ads, or 2) completely disable Yontoo with some client-side script (JavaScript) that can fail/crash the Yontoo runtime and so defeat its purpose of replacing ads. You can visit my blog (http://blogs.digitss.com) and look at the top-right corner; if you find the Google ad replaced with something similar to the image attached to the question, it means your computer/browser is infected too. Looking forward to replies from webmasters - perhaps someone has already written some code or a plugin to make a website (and Google ads) safe from Yontoo or similar runtimes. FYI: it was able to push this runtime into all browsers installed on the machine, so it is a dangerous threat. And yes, I am only using Google ads - I'm not sure whether the Yontoo runtime plays the same trick on other ad networks.

    Read the article

  • EV SSL Certificates - does anyone care?

    - by pygorex1
    Is anyone aware of any data or studies from an impartial source that show the impact of EV SSL certificates on customer behavior? I've been unable to find any such studies. If an EV SSL certificate increases sales on a web storefront by even a few points, I can see the value. Aside from data targeted at EV SSL, it may be possible to guess at customer behavior based on user interaction with regular SSL certificates. Are users even aware of SSL security? Does regular SSL have any proven effect on web storefront sales? Note that I'm not asking about the necessity of good encryption - I'm asking about a potential customer's perception of security and trust.

    Read the article
