Search Results

Search found 9717 results on 389 pages for 'it pro'.

  • Cannot access personal website from home IP. More details inside.

    - by GX67
    This is a recent problem I've been having. My site can be accessed from almost everywhere except my home IP, which is where I do most of my editing and updating. I've tested from my school's network, a friend's connection out of state (multiple states), and a tethered connection on a friend's Android; in all those cases viewing the site, accessing cPanel, and using FTP all work. Here's what happens when I try to view it from my home IP:
    - The page times out in Firefox, IE, and Chrome.
    - From the command line, tracert and ping both fail (log here).
    - downforeveryoneorjustme.com says my site is up, and so do the other site checkers.
    - I can't access my cPanel or FTP accounts.
    - I can't access the host's site either (I use perfectz.info for hosting).
    System settings: no firewall is enabled, and the ports seem to be properly forwarded (they are open in the router settings and reachable from everywhere else). An email forwarder set up in cPanel works just fine (i.e. I can receive emails sent to that address). If any other information is needed, I'll do my best to provide it.
    UPDATE @ilhan: I use two things: 1) the site's cPanel in-browser, and 2) Dreamweaver CS5 FTP. @Matthias: I tested both, and it passes the dual-stack test with a 10/10. What should I do then?

  • What dangers await if I block non-standard, non-major-USA search engine bots from my USA-only website?

    - by Ryan
    I noticed tons of bandwidth being used by non-USA search engine bots, so I began blocking them in an effort to save bandwidth and CPU cycles for actual users and the search engines they come from (Google, Bing, Yahoo, Ask, etc.). Other than potentially losing some international traffic (which isn't really important to us, since all of our content is very USA-centric), what additional dangers should I be concerned about? I'm using a modified version of Jeff Starr's User Agent Blocklist.
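
    For context, user-agent blocklists of this kind usually boil down to a few mod_rewrite lines in .htaccess. A minimal sketch, with placeholder bot names rather than a vetted list:

        RewriteEngine On
        # Send 403 Forbidden to any client whose User-Agent matches the pattern
        RewriteCond %{HTTP_USER_AGENT} (BadBotOne|BadBotTwo|BadBotThree) [NC]
        RewriteRule .* - [F,L]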

  • <meta name="robots" content="noindex"> in "Fetch as Google"

    - by Rodrigo Azevedo
    I don't know why, but when I execute "Fetch as Google" it returns:

        HTTP/1.1 200 OK
        Cache-Control: private
        Content-Type: text/html
        Content-Encoding: gzip
        Vary: Accept-Encoding
        Server: Microsoft-IIS/7.5
        Set-Cookie: ASPSESSIONIDQACRADAQ=ECAINNFBMGNDEPAEBKBLOBOP; path=/
        X-Powered-By: ASP.NET
        Date: Wed, 26 Jun 2013 15:18:29 GMT
        Content-Length: 153

        <meta name="robots" content="noindex">

    The noindex tag doesn't exist in the page. Does anybody know what could be wrong?
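
    One quick thing worth checking is whether the tag (or an equivalent header) is served only to crawlers, e.g. by a plugin or proxy keyed on the User-Agent. A small sketch, assuming the Python requests package and a placeholder URL:

        import requests

        URL = "http://example.com/"  # placeholder: use the affected page
        GOOGLEBOT = {"User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1; "
                                   "+http://www.google.com/bot.html)"}

        as_browser = requests.get(URL)
        as_crawler = requests.get(URL, headers=GOOGLEBOT)

        # Compare what a browser sees with what a crawler sees.
        print("browser sees noindex:", "noindex" in as_browser.text.lower())
        print("crawler sees noindex:", "noindex" in as_crawler.text.lower())
        # A noindex can also arrive as a response header instead of markup.
        print("X-Robots-Tag:", as_crawler.headers.get("X-Robots-Tag"))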

  • Multiple domain links on Google from one WordPress site

    - by user557318
    At present, when I Google the domain name of a WordPress site I have worked on, I get at least three listings (often the top three). The first listing is the only one I am interested in seeing; the others are individual pages from the same WordPress site, i.e.:
    1st hit: www.domain.com
    2nd hit: www.domain.com/about
    3rd hit: www.domain.com/designers
    Does anybody know if it's possible to remove all the links except the plain www.domain.com?

  • Tag link suggestion plugin for WordPress?

    - by Emerson
    Hi, every time I write a post I make sure I add links on words that I have tags for, for example: "The economy of Brazil has improved in the last few years". This ensures that when people re-post my content, a lot of back-links to my tags are created. Doing this manually for every post is quite a lot of work. It would be great if there were a plugin that suggested tag links whenever existing tags match words in the text of the post. Is there such a thing?
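
    The mechanics such a plugin would need are simple string work. Here is a rough Python sketch of the matching-and-linking idea (the tag list and /tag/ URL scheme are made up, and a real WordPress plugin would do this in PHP):

        import re

        def link_tags(text, tags):
            """Wrap the first occurrence of each tag word in a link to its tag page."""
            for tag in tags:
                pattern = re.compile(rf"\b({re.escape(tag)})\b", re.IGNORECASE)
                slug = tag.lower().replace(" ", "-")
                text = pattern.sub(rf'<a href="/tag/{slug}">\1</a>', text, count=1)
            return text

        print(link_tags("The economy of Brazil has improved.", ["Brazil", "economy"]))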

  • Sendmail encrypted

    - by user1948828
    I manage a website running on Apache. It has public and private areas. When people apply for an account to access the protected portions of the site, they submit a TLS/SSL-protected POST containing their information, which is saved to a (hopefully) non-public directory on the server. I then have a Python script that takes the URL-encoded POSTs with this user information, sends a plaintext confirmation back to the applicant, encrypts their information with a freeware Java command-line utility (specifically this one: http://spi.dod.mil/ewizard.htm), base64-encodes it, puts it in a file as a MIME attachment, and uses sendmail to forward the information to my email account (and those of several coworkers scattered around the country) on an Exchange server with Outlook clients.
    This has worked well for years, but it is awkward because it involves manually decrypting the information on a Windows box once it is received, using the above-mentioned encryption utility. This significantly limits how many applications can be processed. I would like to encrypt the information in a format that Outlook/Exchange can understand and display natively, so that these emails can be viewed simply by clicking on them. I do have company-provided PKI public certs for all the people I need to send to, and I am able to send/receive encrypted emails in Outlook manually, but I would like to know how I can send to Outlook from Apache/Linux/Python on the command line using the same PKI certs. I don't need to receive them, just send. Is there a utility that can do this? I had thought PGP might, but I haven't been able to figure it out.
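
    Since Outlook's native format for this is S/MIME and the sender already holds the recipients' public certs, one approach (a sketch only; paths, addresses, and cert files are placeholders) is to let openssl smime build the encrypted message and hand it straight to sendmail:

        import subprocess

        BODY = "/tmp/application.txt"         # plaintext to protect (placeholder)
        CERT = "/etc/pki/coworker-cert.pem"   # recipient's public certificate (placeholder)

        # openssl smime emits a complete MIME message that Outlook can decrypt
        # and display natively; -from/-to/-subject add the mail headers.
        smime = subprocess.run(
            ["openssl", "smime", "-encrypt", "-aes256",
             "-in", BODY,
             "-from", "webserver@example.com",
             "-to", "coworker@example.com",
             "-subject", "New account application",
             CERT],
            capture_output=True, check=True)

        # sendmail -t reads the recipients from the headers openssl just wrote.
        subprocess.run(["/usr/sbin/sendmail", "-t"], input=smime.stdout, check=True)

    To let several coworkers decrypt the same message, list each of their cert files at the end of the openssl command.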

  • Last-Modified/ETag - to include or not?

    - by Kae Verens
    Google's PageSpeed plugin suggests that a website should include Last-Modified and ETag headers: "Specify a cache validator: Resources that do not specify a cache validator cannot be refreshed efficiently. Specify a Last-Modified or ETag header to enable cache validation." However, AskApache suggests that by not including them at all, we speed up websites by eliminating If-Modified-Since and If-None-Match requests: http://www.askapache.com/htaccess/apache-speed-last-modified.html. These are in direct opposition; which should be implemented? I'm leaning towards AskApache's suggestion, since when I want a file cached, I don't want it refreshed.
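
    For reference, the AskApache approach boils down to a few lines of Apache configuration (a sketch; it requires mod_headers and mod_expires, and the content types and lifetimes are just examples):

        # Drop the validators so clients never send If-Modified-Since / If-None-Match
        FileETag None
        Header unset ETag
        Header unset Last-Modified
        # ...and rely on a far-future expiry instead, so cached copies are reused
        # without any revalidation round trip until they expire.
        ExpiresActive On
        ExpiresByType image/png "access plus 1 month"
        ExpiresByType text/css "access plus 1 week"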

  • Is there an app/script I can deploy to enable my users to change their own LDAP passwords?

    - by Tom Wright
    I've recently enabled LDAP-based authentication on my domain. This has allowed us to use a single set of credentials to administer the blog, the forum, and the wiki. Unfortunately, it has come at the cost of users being able to change their own passwords. Ideally, users would be able to visit a page (e.g. mydomain.com/account), authenticate, and then change their password. Does anyone know of a script or app that will allow me to do this quickly and easily? I guess it wouldn't be hard to write in PHP, but I'd prefer not to have the hassle.
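
    Whatever frontend is used, the core is a single LDAP password-modify operation. A minimal sketch in Python with the ldap3 package (the server address, DN layout, and passwords are placeholders for the actual directory):

        from ldap3 import Server, Connection

        server = Server("ldaps://ldap.mydomain.com")          # placeholder host
        user_dn = "uid=twright,ou=people,dc=mydomain,dc=com"  # placeholder DN

        # Bind as the user with their current password, then issue the standard
        # password-modify extended operation (RFC 3062).
        conn = Connection(server, user=user_dn, password="current-pw", auto_bind=True)
        ok = conn.extend.standard.modify_password(
            user=user_dn, old_password="current-pw", new_password="new-pw")
        print("password changed" if ok else conn.result)
        conn.unbind()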

  • Make Blogger load faster

    - by Wladimir Ivanov
    Hi all. I use Blogger as the platform for an electronic music blog. Because of the blog's theme I embed many iframes (YouTube and SoundCloud), and of course this makes the articles load slowly: almost every article on this blog consists of some text and many iframes below it. What should I do in this particular case to make the articles (pages) load faster? Is there a ready-made solution, or should I use something like jQuery lazy load to load each iframe once the scroll reaches it? Any help is greatly appreciated.

  • ASP.NET website deployment [on hold]

    - by Rei Brazilva
    I am getting my feet wet with ASP.NET and have been following the tutorials. I deployed the site to Azure and it worked great. Today I started actually designing the site, and when I published, it looks as if Azure doesn't read any of the files I just updated, added, or modified. It works on my localhost, but not on Azure. I thought that when you publish, everything goes up, including the new files. (I don't have enough reputation to add a picture, so you'll forgive me.) So, basically: how do I get my entire site uploaded? In case anyone does stop by, I was able to pull this out just recently:

        CA0058 Error Running Code Analysis
        CA0058 : The referenced assembly 'DotNetOpenAuth.AspNet, Version=4.0.0.0,
        Culture=neutral, PublicKeyToken=2780ccd10d57b246' could not be found. This
        assembly is required for analysis and was referenced by:
        C:\Users\lotusms\Desktop\LOTUS MARKETING\ASP.NET\WebsiteManager\WebsiteManager\bin\WebsiteManager.dll,
        C:\Users\lotusms\Desktop\LOTUS MARKETING\ASP.NET\WebsiteManager\packages\Microsoft.AspNet.WebPages.OAuth.2.0.20710.0\lib\net40\Microsoft.Web.WebPages.OAuth.dll.
        [Errors and Warnings] (Global)

        CA0001 Error Running Code Analysis
        CA0001 : The following error was encountered while reading module
        'Microsoft.Web.WebPages.OAuth': Assembly reference cannot be resolved:
        DotNetOpenAuth.AspNet, Version=4.0.0.0, Culture=neutral,
        PublicKeyToken=2780ccd10d57b246.
        [Errors and Warnings] (Global)

    Could this have something to do with the problem?

  • Load a part of the page without AJAX [migrated]

    - by nachovall
    I'm developing a web site where I want the left menu to stay fixed while the content for the clicked option is loaded. At the moment each menu link uses AJAX to fetch and return the requested content. Everything works fine, but I would like to avoid this approach, because it makes statistics difficult to follow (among some other things, like Google bots). How can I achieve the same or a similar effect (http://www.foundcrelamps.com/) without JavaScript?
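
    The classic no-JavaScript way to share a fixed menu is to render it server-side on every page, so each menu target is a plain, crawlable URL. A minimal sketch with Apache Server Side Includes (assumes SSI is enabled, e.g. Options +Includes and an .shtml handler; the filenames are placeholders):

        <!-- /products.shtml: an ordinary, crawlable URL -->
        <html><body>
          <!--#include virtual="/menu.html" --><!-- the shared left menu -->
          <div id="content">Products page content here</div>
        </body></html>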

  • Assign subdomains to separate ports on web server

    - by Michael Frank
    I have set up an Abyss web server as a little experiment, and I want to know if it is possible to assign subdomains to different ports on the machine the web server is running on. I have a couple of web UIs that I'd like to assign subdomains to:
    192.168.1.1:8000 becomes example.com/webui1/
    192.168.1.1:8001 becomes example.com/webui2/
    The web UIs are currently reachable by accessing their ports directly, e.g. example.com:8000. I have tried using a reverse proxy, but it seems that this is only usable on one internal IP at a time. What other options do I have?
    UPDATE: The answer is good, but my current setup doesn't meet its requirements: Abyss Web Server X2 is required to use virtual hosts with Abyss.
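
    For comparison, this is what the name-based reverse-proxy arrangement looks like in Apache syntax (a sketch only; Abyss X2 configures its virtual hosts through its own console, and the hostnames are placeholders):

        <VirtualHost *:80>
            ServerName webui1.example.com
            ProxyPass        / http://127.0.0.1:8000/
            ProxyPassReverse / http://127.0.0.1:8000/
        </VirtualHost>

        <VirtualHost *:80>
            ServerName webui2.example.com
            ProxyPass        / http://127.0.0.1:8001/
            ProxyPassReverse / http://127.0.0.1:8001/
        </VirtualHost>

    A path-based variant (example.com/webui1/) would instead use ProxyPass /webui1/ http://127.0.0.1:8000/ inside a single virtual host.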

  • I can see traffic coming from Google ads even though I don't have any Google ads running. How is that possible?

    - by freethinker
    I can see "googleads.g.doubleclick.net/pagead/ads...." as a referrer on my website. However I am not running any google ads. So how am I getting the traffic? Here's the complete URL http://googleads.g.doubleclick.net/pagead/ads?output=html&h=250&slotname=5275434421&w=300&lmt=1311690266&flash=10.1.102&url=http%3A%2F%2Fwww.humbug.in%2Fsuperuser%2Fes%2Fpromiscuous-modo-con-intel-centrino-adelantado-n-6200-agn-tarjeta-inalambrica--214946.html&dt=1311690268347&bpp=3&shv=r20110713&jsv=r20110719&prev_slotnames=7553874535%2C5896307875%2C8590890981&correlator=1311690268052&frm=4&adk=3153418812&ga_vid=1396001617.1311690268&ga_sid=1311690268&ga_hid=558368546&ga_fc=1&u_tz=-180&u_his=1&u_java=0&u_h=1024&u_w=1280&u_ah=960&u_aw=1280&u_cd=24&u_nplug=6&u_nmime=22&biw=1263&bih=851&fu=0&ifi=4&dtd=441&xpc=T5MgG9EVV9&p=http%3A%2F%2Fwww.humbug.in

  • Google Analytics showing more unique visitors than there are pages on an intranet site

    - by DDEX
    I take care of a company intranet and measure its traffic with GA. I am absolutely sure that there are no more than 5,000 URLs on our intranet, and it is impossible to reach it from outside the company network. Yet when I check the number of unique visitors (UV) for the last year, GA says there were 36,500 of them. How is that possible? I thought a unique visitor was supposed to be counted only once in the given time period. Could anybody explain how this actually works? Can it be that the tracking cookies expire after some time, so the same visitor is counted more than once?

  • My site is getting popular, bandwidth expensive

    - by ElbertF
    I'm running a (small) image hosting site which has been getting a little bit of traction lately (2,000 to 5,000 visitors a day, with spikes up to 10,000). It's hosted on Linode, and I've already run out of bandwidth for this month (300 GB). I've experimented with Amazon S3, but that was costing me $5 a day; Linode currently costs me $30 a month. I get a tiny bit of revenue from ads (AdSense and Black Label Ads), but it's not enough to break even. What should I do? Obviously I'd like to keep running the site, but not if it starts costing me a lot of money.

  • Ideas for making money off a website

    - by bradenkeith
    I'm considering acquiring a website that gets close to 20k visits a month. It's currently unprofitable, but obviously I would need to change that to justify the cost. It's a support site for a framework similar to CakePHP: a resource that allows people to find plugins for that framework. How would you go about expanding this site so that it is broad enough to grow traffic, and how would you make money off it? Some ideas I've had:
    1. BuySellAds.com seems to make ads classy and relevant.
    2. Team up with other sites in some way (what ways? e.g. e-commerce sites that sell plugins, like CodeCanyon).
    3. Write tutorials and such to drive more traffic (time-consuming, and already heavily saturated for this framework).
    Are there resources out there that help webmasters in ways like this?

  • Page Spamming via locations

    - by codemonkey
    Hi guys, I am new here, so please be gentle. :) I have created a web page for a small mail-order business. The page asks the reader if they need a supplier for products in their "area" and whether they have ever been let down by a supplier in that "area", etc. It also lists all the local villages and hamlets around the [area] that they can also supply to. The page is dynamically created: the [area] changes, and so do the small towns local to it. The page also contains information on the products, so the ratio of word count to town names is not stupid. An example URL would be www.website.com/1014/Halesowen/. It basically covers the whole of the UK, so around 800 main towns with 28,000 local villages. The URL changes, so do the title and h1 tags, and each page is geocoded for its town.
    My question really is: is this a good or a bad idea? Is it a black-hat technique? I have been told that if I have to ask the question then it probably is, but the site does supply to all these areas, just as any mail-order company does, and I would like to get listed higher in each town for the products. I have seen this done on a few sites, but only with a few targeted towns and not the whole of the UK, so I would be really interested in your thoughts on this. I would post the URL to the site, but as I am new here I am a bit unsure of the rules regarding posting links. The whole site needs a lot of other on-site SEO work, and I will be doing that over the next few weeks. I look forward to your views on this.
    P.S. If I am allowed to post the URL without getting into trouble so you can see it, let me know. Thanks in advance.

  • Live chat solutions

    - by Lèse majesté
    What good live chat/live help solutions are available (preferably free and usable on a site hosted on a LAMP stack)? I'm looking for a way to allow our sales and customer service reps to talk directly with visitors to our site. I've looked at phpopenchat, but it looks very unpolished. The only other free live chat app I've come across looked egregious: the aesthetics and UI design alone made me shudder to think what the underlying code might look like. This isn't a critical feature, and it wouldn't be hard to code up myself, so I'm not really looking for commercial software or paid services (unless there's a really compelling reason to use them). I'm just wondering if any other webmasters have come across a satisfactory free/open-source solution for providing live customer support on their website. As a side note, live voice chat would also be an option, but it has to be designed (or customizable) for customer support rather than a public chatroom.
    Edit: Looking at the responses, it seems there probably aren't going to be many free solutions for this type of business-oriented chat, so feel free to post answers even if they are commercial solutions, as long as they're a good value. Also feel free to post any alternative live support solutions (such as the Skype recommendation) that could be in some way integrated with a website. This will give me a good lay of the land for what people are actually using for live support, and I think it will be more helpful to others reading this question.

  • Blog not even ranking for exact title match, after domain has been dropped twice [on hold]

    - by Akshay Hallur
    Consider a blog about blogging and SEO. Its domain had been dropped (expired) twice before the acquisition; the current owner is the third owner and has had the domain for 5 months. Blog posts are not ranking, even for exact titles: Google+ or other shares show up instead of the content, and some blog posts are not even indexed. Say it gets around 7 organic visits a day. The dropped domain was most likely not used for spam (the Wayback Machine shows 3 captures since 2004 across the two drops; I don't know whether there was email spam), and there are no manual actions in WMT, so there is no reconsideration request to file. What could be the reason for this? How can Google be told that ownership has changed and the domain is now spam-free? Is this domain salvageable, or will things only change after relocating to another domain?

  • Use Google Analytics to target different sections of a blog

    - by Emily Yao
    I have a blog that targets different regions. The Europe region blog has sections in different local languages, such as English, French, and German, and I wonder how to track and analyze the different sections. My initial thought was to segment by URL, but I found that is not a good idea. For example, the URL for the Europe blog is www.myblog.com/europe, and if you click the French section, the URL is www.myblog.com/europe/language/french; but if you click an article in the French section, it is www.myblog.com/article_name. Notice the article link is not www.myblog.com/language/french/article_name!

  • 302 Redirect causes garbage at end of Wordpress link in Facebook

    - by Joao
    When I try to link my WordPress blog on Facebook, the URL doesn't resolve properly: there is garbage appended at the end, and Facebook is not able to retrieve information from the site. This happens with every page and post, and with the main entry. Here's what happens: http://clarissarezende.com.br/ shows up in Facebook as http://clarissarezende.com.br/UPLcS/ (when I copy/paste the link), and no information about the site shows up in FB. I'm using WordPress 3.3.1 with ProPhoto 4. Recently I moved the DNS entry at my ISP: the blog is hosted at clarissarezende.com.br/public_html/blog2, and where the DNS used to point to public_html, I changed it to point to public_html/blog2. Note that I did not move any WordPress files. I made the (I think) necessary changes all over Facebook, but still no dice. Any ideas on what can be happening?

  • ASP.NET C# - do you need a separate data source for each GridView? [closed]

    - by Brian McCarthy
    Do you need a separate data source for each GridView if each GridView accesses the same database but a different table in that database? I'm getting an error on AppSettings that says "non-invocable member". What is the problem with it? Here's the C# code-behind:

        protected void Search_Zip_Plan_Age_Button_Click(object sender, EventArgs e)
        {
            var _with1 = this.ZipPlan_SqlDataSource;
            _with1.SelectParameters.Clear();
            _with1.ConnectionString = System.Configuration.ConfigurationManager.AppSettings("PriceFinderConnectionString").ToString;
            _with1.SelectCommand = "ssp_get_zipcode_plan";
            _with1.SelectParameters.Add("ZipCode", this.ZipCode.Text);
            _with1.SelectParameters.Add("PlanCode", this.PlanCode.Text);
            _with1.SelectParameters.Add("Age", this.Age.Text);
            _with1.SelectCommandType = SqlDataSourceCommandType.StoredProcedure;
            _with1.CancelSelectOnNullParameter = false;
            Search_Results_GridView.DataBind();
        }

    thanks!
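
    As a side note on the error itself: in C#, ConfigurationManager.AppSettings is a NameValueCollection, so it is indexed with brackets rather than called with parentheses (which is what "non-invocable member" is complaining about). The offending line would presumably become:

        // AppSettings is a collection, not a method: index it with [];
        // the value is already a string, so no ToString() call is needed.
        _with1.ConnectionString =
            System.Configuration.ConfigurationManager.AppSettings["PriceFinderConnectionString"];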

  • Which registrar checks the most domains?

    - by Christian W
    When I want a new domain, I usually use GoDaddy to check availability and another registrar to register. This is because GoDaddy checks my wanted domain against the most TLDs. Are there any other sites/registrars that check against more TLDs? What I want is to type my desired second-level domain, e.g. "bobsplace", and have it search through bobsplace.com, bobsplace.net, bobsplace.me, etc., and report back which are available.

  • How to overcome politics of the net (Google translate code refuses to work from a specific region)

    - by Jawad
    According to the FAQ I am not sure whether my question is OK to ask, whether it will be closed, or whether I should post it on meta; I could even see someone downvoting it. However, it is one that has been bugging me since the trouble started, so let me explain. I have this web site (I can't post the link; it does not open from this region). It uses the Google Translate API with the following code:

        <meta name="google-translate-customization" content="9f841e7780177523-3214ceb76f765f38-gc38c6fe6f9d06436-c"></meta>
        <script type="text/javascript">
          function googleTranslateElementInit() {
            new google.translate.TranslateElement({pageLanguage: 'en'}, 'google_translate_element');
          }
        </script>
        <script type="text/javascript" src="http://translate.google.com/translate_a/element.js?cb=googleTranslateElementInit"></script>

    The problem is that since this, it just stopped working. On the site you can see that I had to actually remove the above from here, here, and here, while leaving it here, here, here, and here. This is because the web site "refuses" to load at all from this region on the pages that have the code. If I use the Firefox Stealthy plugin and open the site in Firefox, it works like a charm, without any problems. But with Google Chrome, Apple Safari, and Opera, the site does not load/open at all because of the Google Translate code. (I know this because if I remove the Google Translate code, the site works/loads fine.) It was one thing to program for "cross-browser compatibility" and altogether another to program for "cross-region compatibility". What can I do to make sure that the site works from anywhere? Do I completely remove the Google Translate code and just do without the additional functionality, or do I look for alternatives like this, or according to this?

  • Subdomain still times out after being set up a month ago

    - by user8137
    I would like the subdomain www.high-res.domain.com to be accessible to external customers with specific permissions (like FTP). We use Network Solutions to house domain.com. We recently added a new IP address to point to www.high-res.domain.com, and I gave that IP address to the company that hosts our website. Pinging www.high-res.domain.com resolves to the correct IP address, but the requests still time out. It's been a few weeks now, and when you ping it, it still times out:

        C:\>ping XXX.XXX.X.XXX
        Pinging XXX.XXX.X.XXX with 32 bytes of data:
        Request timed out.
        Request timed out.
        Request timed out.
        Request timed out.
        Ping statistics for XXX.XXX.X.XXX:
            Packets: Sent = 4, Received = 0, Lost = 4 (100% loss)

    tracert times out as well. I even went to DNS tools and a few other sites to check this, and they show the same thing. I recently went into DNS management (DNSmgmt) on our server (Win2k3 SP1) and created an A record under DomainDnsZones, which translated to a CNAME when you look at it. Under the domain it has two entries: one for the subdomain and the other for the website host, each with a separate IP address. Is this correct?
