Search Results

  • MediaWiki: how to make DISPLAYTITLE be used in categories listings

    - by Konstantin Boyandin
    The problem: a MediaWiki-driven site uses subpages to build its page hierarchy. When I add something like Page1/Page2/Subpage, exactly that string appears in listings and looks clumsy. I can't simply use the short subpage title (Subpage in this example), since it can appear in different contexts and could confuse users. I can use the DISPLAYTITLE magic word, with appropriate values of $wgRestrictDisplayTitle and $wgAllowDisplayTitle, to reassign the page title and make it show on the page. However, when I look at categories listing this page, I still see "Page1/Page2/Subpage" instead of the assigned title. Is there a simple way (through a 'hack' or via a relevant extension) to make the new title appear in every listing as well?
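
    For reference, a minimal sketch of the DISPLAYTITLE setup the question describes, with an illustrative short title. In LocalSettings.php:

        $wgAllowDisplayTitle = true;      # enable the {{DISPLAYTITLE:}} magic word
        $wgRestrictDisplayTitle = false;  # allow titles that do not normalize to the page name

    and on the page Page1/Page2/Subpage itself:

        {{DISPLAYTITLE:Subpage}}

    As the question notes, this override changes the title shown on the page, but category listings are built from the stored page title, which is why it does not carry through to them.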

  • ASP.NET C# - do you need a separate datasource for each gridview? [closed]

    - by Brian McCarthy
    Do you need a separate datasource for each gridview if each gridview is accessing the same database but different tables in the database? I'm getting an error on AppSettings that says "non-invocable member". What is the problem with it? Here's the C# code-behind:

        protected void Search_Zip_Plan_Age_Button_Click(object sender, EventArgs e)
        {
            var _with1 = this.ZipPlan_SqlDataSource;
            _with1.SelectParameters.Clear();
            _with1.ConnectionString = System.Configuration.ConfigurationManager.AppSettings("PriceFinderConnectionString").ToString;
            _with1.SelectCommand = "ssp_get_zipcode_plan";
            _with1.SelectParameters.Add("ZipCode", this.ZipCode.Text);
            _with1.SelectParameters.Add("PlanCode", this.PlanCode.Text);
            _with1.SelectParameters.Add("Age", this.Age.Text);
            _with1.SelectCommandType = SqlDataSourceCommandType.StoredProcedure;
            _with1.CancelSelectOnNullParameter = false;
            Search_Results_GridView.DataBind();
        }

    thanks!
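
    For reference: ConfigurationManager.AppSettings is an indexed property (a NameValueCollection), not a method, which is exactly what produces the "non-invocable member" error, and ToString is a method that needs parentheses. A minimal sketch of the corrected line (the indexer already returns a string, so ToString() is redundant):

        // Square brackets index the collection; parentheses try to invoke it.
        _with1.ConnectionString =
            System.Configuration.ConfigurationManager.AppSettings["PriceFinderConnectionString"];

    If the value actually lives in the <connectionStrings> section rather than <appSettings>, the analogous access would be ConfigurationManager.ConnectionStrings["PriceFinderConnectionString"].ConnectionString.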

  • PHP accessible shared content between two websites on the same VPS on different domains/IPs

    - by Lee Fentress
    I have two ecommerce websites, selling digital music downloads, on the same VPS, currently using cPanel/WHM (but I'm thinking of switching to Virtualmin). They have separate domains and IPs, of course. They both draw from the same set of music files, so I have duplicate copies in each website's directory, which takes up a lot of disk space. How might I go about sharing the same set of music files across both sites, allowing PHP access, so that it does not break my shopping cart's ability to serve customers their downloads after they have paid for them? I thought of maybe using symlinks or something, but I don't know if that's possible, or whether it would require circumventing built-in security features of the server. I'm new to VPS management.
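
    For what it's worth, a minimal sketch of the symlink idea, with hypothetical paths: keep one canonical copy outside both document roots and link it into each site.

        ln -s /home/shared/music /home/site1/public_html/music
        ln -s /home/shared/music /home/site2/public_html/music

    Whether PHP can then read through the links typically depends on each account's open_basedir restriction and on file ownership and permissions, which is where the server's built-in security features would come into play.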

  • Disable outbound links without letting others know

    - by tadoman
    Is there a way I can tell Google not to follow external links (pointing to other sites) without letting others know? I know you can mark outbound links with rel=nofollow, or disallow crawling in robots.txt, but those are things others can see as well. I'm just wondering if there's a way to tell Google not to follow those links without letting others know, like a setting in Webmaster Tools or something similar. (There's definitely one way: I could set an exception in my server's conf file that checks for the "googlebot" user agent and serves a different version of robots.txt, so that when a different user checked that link it would return a different robots.txt than the one served to Googlebot. However, I'm not too sure Google would be too happy about that.) Thank you
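
    For illustration only, a hypothetical .htaccess-style sketch of the user-agent trick the question describes. Note that serving Googlebot a different robots.txt than everyone else is cloaking, which violates Google's guidelines, as the author suspects:

        RewriteEngine On
        # Serve an alternate robots.txt only when the user agent claims to be Googlebot
        RewriteCond %{HTTP_USER_AGENT} Googlebot [NC]
        RewriteRule ^robots\.txt$ robots-googlebot.txt [L]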

  • How can I use domain masking without causing self-referrals in Google Analytics

    - by Cdore
    I have one old domain that points to a website's server IP (let's call it www.oldsite.com). I have a new one, www.newsite.com, that is set up to be forwarded to a specific page on the website. Due to the way the host of newsite.com places the website in a frame, Google Analytics lists newsite.com as a source rather than the source visitors actually came from, causing a self-referral. One solution I looked up is to edit the code of the iframe, but of course there's no way to edit the host's masking source code. Another solution I tried previously was to have www.newsite.com point directly to the address that www.oldsite.com pointed to. It solved the Analytics problem, but in exchange the URL masking no longer worked: in the address bar, the site came up as www.oldsite.com. Is there a way to keep URL masking and still keep Google Analytics happy? The website is hosted on a cloud server, if that is any more useful information.

  • Cropping images & SEO

    - by user1181950
    So I have a page with a bunch of images of largely varying sizes. The layout of the page is such that the images are all displayed as square tiles, so just resizing will distort them. What I've been doing previously is, when users upload images, I resize and crop them appropriately, display the new image as the thumbnail, and load the full image when the user clicks on it. However, I just realized this is an issue for SEO, as Google will crawl the thumbnails and stick them in Google Images instead of the full images. Is there any way to show a cropped/resized image but have Google Images show the full image? I could do something with CSS using an enclosing div and overflow:hidden, but I'd imagine the performance on that would be pretty bad. Any suggestions? Thanks! PS. I saw this (Make google index the actual image not the thumbnail), but in my case I have users continuously uploading images, and the database of images is always changing and pretty big (thousands), so a sitemap would be pretty unwieldy...
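
    For reference, a minimal sketch of the overflow:hidden crop mentioned above (class and path are illustrative): the full image is served and the tile simply clips it, so crawlers still fetch the original file.

        <div class="tile">
            <img src="/images/full/photo123.jpg" alt="...">
        </div>

        .tile {
            width: 200px;           /* square tile */
            height: 200px;
            overflow: hidden;       /* clips the full-size image instead of resizing it */
        }

    The cost here is bandwidth rather than rendering: every tile downloads the full image, which is the performance concern the question raises.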

  • Restricting crawler activity to certain directories with robots.txt

    - by neimad
    I would like to use robots.txt to prevent indexing of some parts of my website. I want search engines to index only the / directory and not search inside my controllers. In my robots.txt, I have this:

        User-Agent: *
        Disallow: /compagnies/
        Disallow: /floors/
        Disallow: /spaces/
        Disallow: /buildings/
        Disallow: /users/
        Disallow: /

    I put this file in /mysite/public. I tested the file with a robots.txt validator and got no errors. However, Google still returns results from my site. For testing, I added Disallow: /, but again, Google indexed all pages. floors, spaces, buildings, etc. are not physical directories. Is this a bug? How can I work around it?

  • In cPanel, mail goes to spam instead of inbox in Gmail

    - by Robin Jain
    I have a cPanel VPS server. I have created a domain on the same server, but when I send a mail through webmail to a Gmail address, it goes into spam. Note: the mail IP is not blacklisted, SPF records are enabled, DKIM is enabled, and the reverse DNS is perfect. Email header information:

        Delivered-To: [email protected]
        Received: by 10.143.93.13 with SMTP id v13csp119806wfl;
                Fri, 6 Jul 2012 08:01:36 -0700 (PDT)
        Received: by 10.182.52.42 with SMTP id q10mr26133912obo.46.1341586895571;
                Fri, 06 Jul 2012 08:01:35 -0700 (PDT)
        Return-Path: <[email protected]>
        Received: from lakshyacs-u.securehostdns.com ([50.97.147.134])
                by mx.google.com with ESMTPS id fx3si18028369obc.144.2012.07.06.08.01.35
                (version=TLSv1/SSLv3 cipher=OTHER);
                Fri, 06 Jul 2012 08:01:35 -0700 (PDT)
        Received-SPF: pass (google.com: domain of [email protected] designates 50.97.147.134 as permitted sender) client-ip=50.97.147.134;
        Authentication-Results: mx.google.com; spf=pass (google.com: domain of [email protected] designates 50.97.147.134 as permitted sender) [email protected]
        Received: from localhost.localdomain ([127.0.0.1]:39016 helo=harishjoshico.com)
                by lakshyacs-u.securehostdns.com with esmtpa (Exim 4.77)
                (envelope-from <[email protected]>)
                id 1SnA2J-0006Nq-05
                for [email protected]; Fri, 06 Jul 2012 20:31:35 +0530
        Received: from 223.189.14.213 ([223.189.14.213])
                (SquirrelMail authenticated user [email protected])
                by harishjoshico.com with HTTP;
                Fri, 6 Jul 2012 20:31:35 +0530
        Message-ID: <[email protected]>
        Date: Fri, 6 Jul 2012 20:31:35 +0530
        Subject: ggglkhl
        From: [email protected]
        To: [email protected]
        User-Agent: SquirrelMail/1.4.22
        MIME-Version: 1.0
        Content-Type: text/plain;charset=iso-8859-1
        Content-Transfer-Encoding: 8bit
        X-Priority: 3 (Normal)
        Importance: Normal
        X-AntiAbuse: This header was added to track abuse, please include it with any abuse report
        X-AntiAbuse: Primary Hostname - lakshyacs-u.securehostdns.com
        X-AntiAbuse: Original Domain - gmail.com
        X-AntiAbuse: Originator/Caller UID/GID - [47 12] / [47 12]
        X-AntiAbuse: Sender Address Domain - harishjoshico.com

        jhkhl

  • What sort of phone numbers are allowed as the WHOIS contact?

    - by billpg
    I'm getting a non-trivial number of scam phone calls to the contact phone number listed in WHOIS. Could I change it to a premium-rate line? If the scammers want to talk to me so much, make them pay for the privilege! Seriously though, are there any restrictions on the type of phone number I can give as my WHOIS contact, other than that it must be a number at which the domain holder can be contacted? In the UK, cell phones are more expensive for the caller to reach than landlines, so I suspect a significant number of people are already listing a "premium rate" phone number.

  • OpenSearchDescriptions good or bad signal in Google's eyes?

    - by JeremyB
    I noticed a site using this tag:

        <link rel="search" type="application/opensearchdescription+xml" title="XXXXXXXXX" href="http://www.XXXXXXXXXX.com/api/opensearch" />

    As I understand it (based on http://www.opensearch.org/Home), this tag is a way of describing search results (so you use it on pages which contain search results) to make it easier for other search engines to understand and use your results. Given that Matt Cutts has said Google generally frowns on "search results within search results", is using this tag a bad idea on a page that you hope will rank well in Google?

  • How can I tell GoogleBot that a subdirectory is now a subdomain?

    - by cwd
    I had about a million pages of a catalog indexed under a subdirectory, and that content has now moved to a subdomain. GoogleBot is crawling each one of them and getting a 301 redirect to the new location. Even though I have set up the redirect rule in the Apache sites-enabled configuration file (i.e. the redirect happens early, before PHP is even loaded), the server isn't handling the load well. GoogleBot is making around 5 requests per second, and on top of my normal traffic that hikes up the CPU for a few hours at a time. I checked Webmaster Tools and the corresponding documentation for a way to let Google know that the content had moved from a subdirectory to a subdomain, but with little luck. Basically, the most helpful thing I saw said to just send 301 headers for the new location. How can I tell GoogleBot that a subdirectory is now a subdomain? If that is not an option, how can I more efficiently send 301 redirects for a particular subdomain? I was thinking perhaps Nginx, but I'm not sure that I can run both Apache and Nginx side by side on port 80 for different subdomains.
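
    For reference, a hypothetical sketch of the early-stage redirect described, done with mod_alias in the old vhost so the 301 is issued before PHP is ever invoked (names and paths are illustrative):

        <VirtualHost *:80>
            ServerName www.example.com
            # Send the whole old subdirectory to the new subdomain, preserving paths
            RedirectMatch 301 ^/catalog/(.*)$ http://catalog.example.com/$1
        </VirtualHost>

    Separately, Webmaster Tools has a crawl-rate setting that may help temper Googlebot's 5 requests per second while the million redirects are being digested.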

  • GA and Unique visitors again

    - by DDEX
    I take care of a company intranet and measure the traffic with GA. I am absolutely sure that there are no more than 5000 URLs in our company, and it is impossible to reach the intranet from outside the company network. Yet when I check the number of Unique Visitors (UV) for the last year, GA says there were 36,500 of them... How is that possible? I thought UV should count each visitor only once in the given time period. Could anybody explain how this actually works? Can it be that the tracking cookies expire after some time, so visitors are counted more than once?

  • Grapeshot crawler ignoring robots.txt

    - by QF_Developer
    Has anyone come across a crawler called Grapeshot? They are hammering the same page repeatedly on our website. I believe they are looking for ad-related keywords, based on previous content ad campaigns. The odd thing is we never ran any such campaign on the page they are so interested in. We do have a few pages running AdSense; is this what has attracted Grapeshot? I've added the following declaration to my robots.txt, but they don't seem to be honouring it:

        User-agent: grapeshot
        Disallow: /

    Any ideas on how to block this nuisance crawler? I'm starting to think the best way is to set up IP rules in IIS.
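
    If it comes to IP blocking, a hypothetical IIS 7+ web.config sketch (the address below is a documentation placeholder; real rules would use the crawler's published ranges, and the IP and Domain Restrictions role feature must be installed):

        <configuration>
          <system.webServer>
            <security>
              <ipSecurity allowUnlisted="true">
                <add ipAddress="203.0.113.0" subnetMask="255.255.255.0" allowed="false" />
              </ipSecurity>
            </security>
          </system.webServer>
        </configuration>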

  • Creating an encrypted, web-based proxy

    - by Jason
    I have moved to Asia, where my internet connection is censored, and I'd like to check my messages on social sites which happen to be blocked. As virtually all proxy servers are blocked in this country, I've decided to attempt to roll my own encrypted proxy server. Please note, the key word here is encrypted: if the sniffer sees anything like f@c3b00k or w:k:p3d:ia travelling down the wire, I'm had. I have a website hosted with GoDaddy (Windows with PHP 5.2 and IIS 7). Is there any way I can set up an encrypted proxy through this service? If so, how, and what open source tools are available to use?
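
    Not a recommendation of any particular tool, but for a sense of scale: the core of a web-based proxy is just a script that fetches a URL server-side and relays it back, and the "encrypted" part comes from serving that script exclusively over HTTPS (which assumes the hosting plan supports SSL). A bare-bones, PHP 5.2-compatible sketch, with none of the link rewriting, cookie, or POST handling a real proxy needs:

        <?php
        // Fetch-and-relay sketch; requires allow_url_fopen to be enabled.
        // Serve this page over HTTPS only, so the target hostname is encrypted in transit.
        $url = isset($_GET['u']) ? $_GET['u'] : '';
        if (!preg_match('#^https?://#i', $url)) {
            header('HTTP/1.1 400 Bad Request');
            exit('Missing or invalid u parameter');
        }
        echo file_get_contents($url);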

  • SH404SEF URLs in Joomla 1.5

    - by Tao Bellamine
    I have two modules to play with URLs: the global configuration and the sh404sef module. The global config is set to "SEF URLs: YES" and "mod_rewrite enabled: YES", and sh404sef is set to "URL optimization: NO". My problem is that even with SEF URLs set in the global config, my URLs still don't seem to be that "user friendly", so I turn on "URL optimization" using the sh404sef module and get better, more descriptive URLs. However, the problem I inherit from doing this is that my dynamically populated Chronoforms get messed up (only the Chronoforms; other forms are fine); these forms now show up on the homepage instead of on their own reserved page. Here's an example. Old form "GOOD" URL:

        http://www.mycraftwork.com/index.php?option=com_content&view=article&id=94

    New optimized "BAD" URL:

        http://www.mycraftwork.com/handthrown-pottery/alladin-teapot/index.php?option=com_content&view=article&id=94

    Any help would be GREATLY appreciated! I can even turn sh404sef on and off if some people are interested in seeing the issue LIVE. Thanks!! Tao Bellamine

  • Help me make a cronjob/screen command please?

    - by Josip Gòdly Zirdum
    Hi guys, I want to set up a cronjob on reboot to do this:

        cd /home/admin/vivalaminecraft.com && screen -d -m -S mcscreen && mono McMyAdmin.exe

    The issue is that when I execute this it seems to create the screen but doesn't run mono McMyAdmin.exe inside the screen... Is there a "then" command, so it does 1, then 2, then 3? Could someone please help out :) So I tried this:

        @reboot screen -dmS minecraft
        @reboot cd /home/admin/vivalaminecraft.com
        @reboot mono McMyAdmin.exe

    It still doesn't work. The screen is created but it doesn't have the mono execution in it. I also put this in it:

        #!/bin/bash
        screen -dmS minecraft;
        cd /home/admin/vivalaminecraft.com;
        mono McMyAdmin.exe;

    Is this correct?
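
    For reference, a minimal sketch of a single crontab entry that should achieve this (same paths as above). The command to run must be passed to screen itself: each && in the original chain runs *after* screen has already detached, i.e. outside the session.

        @reboot cd /home/admin/vivalaminecraft.com && /usr/bin/screen -dmS minecraft mono McMyAdmin.exe

    Here screen starts detached (-dm), names the session (-S minecraft), and runs mono McMyAdmin.exe as the session's command; the cd runs first so mono starts in the right working directory.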

  • Shopping cart for service providers?

    - by uos??
    From my limited exposure, it seems to me that most shopping cart/ecommerce platforms are aimed specifically at product-based retailers. On several occasions now, I've been asked about ecommerce solutions for service providers. That is, it's basically just a single product with payment but no shipping, and a highly configurable "product". Any recommendations for a cost-efficient solution (high feature coverage) for such a web platform? Requirements:

    - .NET
    - No (or suppressed) product catalog
    - A service customization selection form
    - Payment (probably PayPal with accountless credit card processing)
    - Guest purchases (no site account required)
    - Email confirmation
    - Customer-service-facing control panel

    It's hard to search for such a product because I get "web service based ecommerce software" and so on clouding up the results.

  • Photoshop, slice, Dreamweaver, web?

    - by Omega
    So I am playing around with Photoshop and Dreamweaver. I have created a site layout and have used the slice function on it. Next, I saved it as HTML & images. In Dreamweaver, I open the HTML file and fill the page with content, links, etc. I have a website and everything, and I would like to use my newly created HTML page on it. But, obviously, if I copy & paste the HTML to my website it won't work, because it will lack the images. Two problems: I can't find the images, and apparently there are a lot of them. I am sure I am making a basic mistake with the images. Can someone help me?

  • Page Spamming via locations

    - by codemonkey
    Hi guys, I am new here so please be gentle :) I have created a web page for a small mail order business. The page asks the reader if they are in need of a supplier for products in their "area" and if they have ever been let down by a supplier in that "area", etc. It also lists all the local villages and hamlets around the [area] that they can supply to. This page is dynamically created, so the [area] changes, and so do the small towns that are local to it. The page also contains information on the products, so the word count vs. town names is not stupid. An example of one of the URLs would be www.website.com/1014/Halesowen/ It basically covers the whole of the UK, so around 800 main towns with 28,000 local villages. The URL changes, so do the title and h1 tags, and each page is geocoded for its town. My question really is: is this a good or bad idea? Is it a black hat technique? I have been told that if I have to ask the question then it probably is, but the site does supply to all these areas, just as any mail order company does, and I would like to get listed higher in each town for the products. I have seen this done on a few sites, but only with a few targeted towns and not the whole of the UK, so I would be really interested in your thoughts on this. I would post the URL to the site, but as I am new here I am a bit unsure of the rules regarding posting links. The whole site needs a lot of other on-site SEO work doing, and I will be doing that over the next few weeks. I look forward to your views on this. P.S. If I am allowed to post the URL without getting into trouble so you can see it, someone let me know? Thanks in advance

  • Changing website Url - Am I making an SEO mistake

    - by Denis
    I have a website on a .com domain that is a year old. The business is a shop based in Ireland, and I have purchased a .ie domain. I plan to move the website over to the new domain. SEO: good or bad idea? Old URL: SmythsOfTerenure.com | New URL: SmythsComputerRepair.ie (I am using fake names and a fictional business in the example URLs.) The new domain has my main keyword in it; the old domain has my family name and business location (city district). The site currently ranks high for lots of relevant keywords in Google, with low traffic and low competition. Current website traffic is about 80 sessions per week, 80% of it organic from Google. I am changing domain in an attempt to help SEO long term, by having a ccTLD (.ie rather than .com) and my main keyword in the domain. I plan to do 301 redirects from old to new and update Google Webmaster Tools and Google Analytics, but am I making a mistake changing it at all, given that rankings may fall in the short term? Homepage PR = 0 and very few inbound links. Should I just leave it on the old domain? Or after a few months should I be back to ranking as well as I do now?
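
    For reference, a minimal .htaccess sketch of the planned site-wide 301, using the question's fictional domain names, placed on the old .com site:

        RewriteEngine On
        # Redirect every request on the old domain to the same path on the new one
        RewriteCond %{HTTP_HOST} ^(www\.)?smythsofterenure\.com$ [NC]
        RewriteRule ^(.*)$ http://smythscomputerrepair.ie/$1 [R=301,L]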

  • Multilingual website without language component in the URL

    - by user359650
    I'm working on a website for Canada which will have French and English versions. For SEO purposes, I would like to avoid using any language tag in URLs, because I believe it will have more impact (e.g. example.ca/products is better than en.example.ca/products or example.ca/en/products). I believe this is technically possible because the two languages are sufficiently different that the URLs won't conflict with one another (e.g. if you want a "product" page, it will be /products in English and /produits in French, so you know which language the URL is about). Since Google (and most likely others) doesn't rely on the URL (nor on HTML tags) to determine the content language, I don't see any problem with search engines. To make this possible, I've thought about using a cookie distinct from the session cookie (e.g. example.org_language) with a long-term expiry (e.g. N years) that will remember the language chosen by the user. That way, when people visit the website with a new browser session, they get served the proper language. I have already given up on users being able to switch a single page from English to French: when people choose English or French from the menu, they will be redirected to the corresponding version of the home page. Do you foresee any problems with not using a language component in the URL (whether domain or path), as long as one makes sure URLs don't conflict?
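
    A minimal sketch of the long-lived language cookie described above (the cookie name and the five-year lifetime are illustrative stand-ins for the question's example.org_language and "N years"):

        <?php
        // Remember an explicit language choice, falling back to the cookie,
        // then to a site default.
        $allowed = array('en', 'fr');
        if (isset($_GET['lang']) && in_array($_GET['lang'], $allowed)) {
            $lang = $_GET['lang'];
            setcookie('example_org_language', $lang, time() + 5 * 365 * 24 * 3600, '/');
        } elseif (isset($_COOKIE['example_org_language'])) {
            $lang = $_COOKIE['example_org_language'];
        } else {
            $lang = 'en'; // site default
        }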

  • How to open the console in different browsers?

    - by Šime Vidas
    Chrome: Press CTRL + SHIFT + I to open the Developer Tools, then click the "Open console" icon in the bottom left corner.
    IE9: Press F12 to open the developer tools, open the Script tab, and click the "Console" button on the right.
    Firefox 4: Press CTRL + SHIFT + K to open the Web Console.

    What about Opera 11 and Safari 5?

    Clarification: by console I mean the JavaScript console that lets you input and execute JavaScript code.

  • Resize content for "frame" on .aspx page which draws product content from suppliers' page

    - by zenbike
    I have a retail store site (no online sales) which displays our supplier's webpage in a "frame", in order to have the most accurate and up-to-date information for our customers (example). My issue is that the page it pulls in doesn't fit the frame. It looks pretty poor, and part of the content is obscured. Is there a way to scale the content drawn in to the size of the frame? The same site also has an intermittent issue with the Flash banner loading; when it doesn't load, the layout of the header on the page is awful. Any ideas there will also be appreciated.
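
    One possibility, assuming the "frame" is an iframe the site controls: give the iframe the framed page's natural dimensions and shrink it with a CSS transform (the selector, sizes, and 0.75 factor are all illustrative; the vendor prefixes reflect browsers of the time):

        iframe.supplier {
            width: 1200px;                      /* natural width of the framed page */
            height: 900px;
            -webkit-transform: scale(0.75);     /* 1200px of content renders in a 900px slot */
            -moz-transform: scale(0.75);
            transform: scale(0.75);
            -webkit-transform-origin: 0 0;
            -moz-transform-origin: 0 0;
            transform-origin: 0 0;
        }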

  • How to easily delete all email forwarders in cPanel?

    - by psoft
    I know that I can import a list of email forwarders using cPanel, but how can I delete a list? I want to manage 300+ addresses as a membership list for my organization, and I want to be able to delete that many without clicking 'Delete' and then 'Confirm' (or whatever it is) 300 times. Even if I were only able to delete ALL forwarders and then upload a modified list, that would be fine with me. Note: I'm using a shared hosting package through SiteGround, and a tech service rep informed me that I can't use cPanel scripts in a shared environment. Any suggestions?

  • Google's process for publishing/modifying pages [closed]

    - by Glenn Dayton
    I'm assuming that a group of people at Google has control over certain sections of google.com, but how does Google make sure that employees don't accidentally or intentionally sabotage the website? Does Google use Adobe Contribute or some similar product for sharing/publishing the website? Do employees use WebDAV, FTP, SFTP, or SSH to publish the site? Since Google has hundreds of thousands of servers, it probably takes some time for them all to update. Does Google transmit the new copy of the website to all servers before publishing it at once? This question does not apply to Google editing a database and having a page reflect the database's changes; it applies to employees editing the source code and/or back end of the site.
