Search Results

Search found 24735 results on 990 pages for 'site ranking'.


  • Common reasons for the 'Sys is undefined' error in ASP.NET Ajax applications

      In this blog I will try to summarize the most common reasons for getting the famous 'Sys is undefined' error when running an Ajax-enabled web site or application (there are almost one million results on Google for that phrase). Where does it come from? In every Ajax web page's source you will see code like this:

          <script type="text/javascript">
          //<![CDATA[
          Sys.WebForms.PageRequestManager._initialize('ScriptManager1', document.getElementById('form1'));
          Sys.WebForms.PageRequestManager.getInstance()._updateControls([], [], [], 90);
          //]]>
          </script>

      This is the initialization script of the ScriptManager. So, if for some reason the Sys namespace is not available when that code executes, you get the 'Sys is undefined' error. Here are the most common reasons and solutions for the problem:

      1. The error occurs when you have added a control from RadControls for ASP.NET AJAX, but your application is not configured to use ASP.NET AJAX. For example, in VS 2005 you created a new Blank Site instead of a new Ajax-Enabled Web Site and the 'Sys is undefined' message pops up. To fix it you need to follow the steps described in the Configuring ASP.NET Ajax article (check the topic called "Adding ASP.NET AJAX Configuration Elements to an Existing Web Site") or simply create the Ajax-Enabled Web Site. You can also check my other blog post on the matter: Visual Studio 2008: Where is the new ASP.NET Ajax-Enabled Web Site template?

      2. Authentication - because the website denies access to all pages for unauthorized users, access to the Telerik.Web.UI.WebResource.axd handler (the default handler of RadScriptManager) is also unauthorized. This causes the handler to serve the content of the login page instead of the combined scripts, hence the error. To solve it, add a <location> section to the application configuration file to allow access to Telerik.Web.UI.WebResource.axd for all users, like:

          <configuration>
            ...
            <location path="Telerik.Web.UI.WebResource.axd">
              <system.web>
                <authorization>
                  <allow users="*"/>
                </authorization>
              </system.web>
            </location>
            ...
          </configuration>

      Note that access to the standard ScriptResource.axd and WebResource.axd is automatically allowed for all users (authenticated and unauthenticated), so if you use the ScriptManager instead of RadScriptManager you will not face this problem. The authentication problem also does not manifest when you disable script combining or use the CDN. Adding the above configuration section will make it work with RadScriptManager's combined script.

      3. The IE6 browser fails to load the compressed script. The problem does not appear in any other browser. There is a well-known bug in older versions of IE6 which causes them to lose the first 2,048 bytes of data sent back from a web server that uses HTTP compression. The latest versions of RadScriptManager do not compress the output at all if the client is IE6, but in previous versions you need to manually disable output compression to prevent the error. So, if you get the 'Sys is undefined' error in IE6, update to the latest version of RadControls or simply disable the output compression.

      4. Requests to the *.axd files return Error Code 404 - Not Found. This can be fixed easily: check in the IIS management console that the .axd extension (the default HTTP handler extension) is allowed, and also check that the "Verify that file exists" option for that mapping is unchecked (click the Edit button on the handler mapping to check). More information can be found in our troubleshooting article and in the ASP.NET QA team blog post.

      5. The virtual directory in IIS is not marked as a Web Application. Converting it to a Web Application should fix the problem.

      6. Check for the <xhtmlConformance mode="Legacy"/> option in your web.config and remove it. It would be rather rare to become a victim of this exact case, but still keep it in mind. Scott Guthrie describes it in more detail.

      In the points above I mentioned the terms web resources, JavaScript output and compressed script several times. If you want to find out more about these, please see the Web Resources Demystified series by my friend and colleague Atanas Korchev.

      I hope that one of the above solutions will help you get rid of the 'Sys is undefined' error.
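
      If you are not sure which of the cases above you are hitting, a quick client-side check can confirm whether the Microsoft Ajax client framework was delivered at all before the ScriptManager's init script runs. This is only a diagnostic sketch of my own (the alert text and placement are not from the original post), placed near the end of the form after the script references:

          <script type="text/javascript">
          // Diagnostic only: if the Sys namespace is missing at this point, the framework script
          // (ScriptResource.axd or Telerik.Web.UI.WebResource.axd) was never delivered correctly -
          // check the network traffic for a 404, a login page, or a truncated compressed response.
          if (typeof Sys === 'undefined') {
              alert('ASP.NET AJAX client framework failed to load.');
          }
          </script>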

    Read the article

  • How to properly URL/domain forward

    - by NRGdallas
    No clue on a title for this - someone feel free to suggest an edit. I have a client that has a website. He owns around 200 domains and wants each domain to contain content from the main website. The header, footer, and navigation bars will remain the same for each domain, but the actual page content will vary (obviously there are duplicate content issues; I'm open to suggestions). He wants each individual page to be its own separate domain, rather than a URL within the main domain (page1.com, page2.com, etc. - NOT site.com/page1.html - even though the file is actually hosted at site.com/page1.html, and all links will point to site.com/whatever accordingly). What would be the best place to start reading and learning about how to do this, and what concerns and considerations should be kept in mind?
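
    One way this is commonly wired up on Apache is to point every extra domain at the same document root as the main site and rewrite its root URL to the page it should show, then declare the main site's URL as canonical to limit the duplicate-content damage. The hostnames and file names below come from the question; the paths are assumptions, so treat this as a sketch of one possible setup rather than a recommendation:

        # vhost config for one of the ~200 extra domains
        <VirtualHost *:80>
            ServerName page1.com
            ServerAlias www.page1.com
            # same document root as the main site (path is an assumption)
            DocumentRoot /var/www/site.com

            RewriteEngine On
            # serve the main site's page1.html when page1.com/ is requested
            RewriteRule ^/?$ /page1.html [L]
        </VirtualHost>

    Inside page1.html, a <link rel="canonical" href="http://site.com/page1.html"> tag would then tell search engines which copy of the content to treat as the original.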

    Read the article

  • The danger of changing the domain of your portfolio

    - by Mervin
    So I have an online portfolio that is available at mervin-ux-portfolio.com, but I am planning to change hosts since the current host is hitting me with a very high yearly renewal rate. When I inquired about domain transfers, they told me that since I had not initiated the transfer within 14 days of the domain's expiry, they cannot do it immediately and it would take about two weeks to release the domain name. Since I don't like the idea of my site being down for two weeks, I was wondering whether I should start afresh with a new domain on a new host, and what the potential dangers of that are (I have a full backup of the site, so creating a replica on the new host won't be hard). I also won't be losing any business or work, since I currently work full time, but I was wondering about the challenges of getting the new domain back to the top of search results and basically getting it out there, assuming I go the new-domain route. I know this is strictly not a UX question, but I was hoping people could give some suggestions on what I should do.

    Read the article

  • Information about links disappeared from Webmaster Tools

    - by Bobrovsky
    I discovered that all information about links to my site has disappeared from Google Webmaster Tools. The last time I checked the "Links to your site" page in GWT there was a nice list of linking domains and so on, but now there is only "No data available." There have been no changes to the site's contents. Why could this be, and what can I do to fix it? About a month earlier I found that the PR of all my pages had dropped by 2 points. Could these changes be related?

    Read the article

  • Best Option for Creating A Small Church Website

    - by Jim
    I've been asked to create a website for a small church. Their prior site was hosted on GeoCities, which no longer exists. They are not looking for anything robust, just an informational site with a calendar and maybe a contact form. The church would also like to be able to administer the site with little technical know-how. Cost is also an issue. Given these requirements, something like sites.google.com seems like a good option. However, my main concern is that Sites will suffer the same fate as GeoCities; it is definitely not a flagship Google product. Are there other good alternatives that fit the requirements?

    Read the article

  • Online iPad 1&2 emulators give different results compared to the real thing

    - by Systembolaget
    I'm designing a centered website (jQuery Isotope). The sandbox is here. I have used some online iPad 1 & 2 emulators to test how the site is viewed on these devices. Then I managed to get hold of the real thing. The result: on real iPads, the site is centered and the layout adjusts automatically as expected; in the online iPad emulators, the site is not quite centered and additional Isotope elements are squeezed in. Of course I trust the real thing more than online emulators, but why is this happening? To me it feels like website testing with online emulators is not so reliable after all. If this question is wrong here, please move it or tell me where it should go; SO is about programming, and this question isn't. Thanks!

    Read the article

  • New website, plans to go large. How do you protect yourself?

    - by John Redyns
    I'm planning to create a new site that I hope will reach a decent level of popularity and use. I have made sites before, but they weren't serious - they had no intended purpose beyond personal use and use by friends. I've never been able to find a solid post on good steps for protecting yourself, your site and your idea before you start. This site will always be free and will not bring in any revenue through ads or the like, but I plan to in the future and want to make sure I'm in the clear legally, for one. Do I need to copyright anything, or anything of the same nature as copyright? Do I make an LLC to operate it under? Apologies for this poorly written question; basically I want to be legal, and I want to make sure nobody can simply rip off my idea or name(s). Thanks

    Read the article

  • Tracking referrals between profiles on the same domain in Google Analytics

    - by doctororange
    I have a website at mydomain.com that uses Analytics, and a blog that resides at mydomain.com/blog/, which also uses Analytics. They are on different profiles. The main site uses something like:

        _gaq.push(['_setAccount', 'UA-XXXXXXXX-6']);

    While the blog uses:

        _gaq.push(['_setAccount', 'UA-XXXXXXXX-7']);
        _gaq.push(['_setCookiePath', '/blog/']);

    My issue is that this does not seem to track referrals from the blog through to the main site when, for instance, the logo that links to the main site is clicked. Ideally, I would like clicks on this logo to report that the source was mydomain.com/blog/, but because both sections are on the same domain they seem to register as direct traffic. Have I missed a step in my configuration, or will I have to resort to linking to something like mydomain.com?ref=blog? Thank you.
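
    A low-effort variant of the ?ref=blog idea is to use Google Analytics' own campaign parameters on the blog's links to the main site, so the main profile records those visits with an explicit source instead of treating them as direct. The parameter values below are only an illustration of the general technique, not part of the question:

        <!-- link on the blog's logo, tagged so the main site's profile sees "blog" as the source -->
        <a href="http://mydomain.com/?utm_source=blog&amp;utm_medium=internal&amp;utm_campaign=blog-header">
            Back to the main site
        </a>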

    Read the article

  • E-commerce + CMS: 2 sites or one?

    - by Guandalino
    OK, let's say that a customer already has a CMS-managed web site but now wants to sell goods online using an e-commerce platform (Magento in this case). My question is: does it make any difference whether you have just one site running both the CMS and the e-commerce platform (www.mycompany.com), or one site for the CMS (www.mycompany.com) and another (www.mycompany-shop.com) for e-commerce? I'd like to know the pros and cons of these approaches, so that I can best advise the customer. --EDIT: I forgot to say that I'd prefer to have two separate web sites. That way I shouldn't have to learn how to integrate them (one is in Python, the other in PHP).

    Read the article

  • How to point DNS records to Amazon AWS Elastic Beanstalk

    - by Lance
    I just created a new environment with AWS and uploaded a .zip file into Elastic Beanstalk. I used to have a friend's server host my site instead of GoDaddy, so I changed my custom DNS name servers from pointing at my friend's server to GoDaddy's. AWS told me that I need to add a CNAME record on GoDaddy, and I did: the alias is lance and the host is lance-env.elasticbeanstalk.com. I know this change can take 24-48 hours to take effect, but it's been a day already and when I go to my site a default page from GoDaddy appears. I'm very new to AWS and would just like to find a way for AWS to host my site without using Route 53.
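
    In zone-file notation, the record described in the question would look roughly like the sketch below (the names are taken from the question; the TTL is arbitrary). Note that a CNAME named lance only covers lance.yourdomain.com; the www host would need its own record pointing at the Beanstalk environment as well, and a bare domain cannot normally carry a CNAME at all:

        ; sketch of the CNAME described above, zone-file style
        lance    3600    IN    CNAME    lance-env.elasticbeanstalk.com.
        ; and, if the site should answer on www as well (an assumption, not from the question):
        www      3600    IN    CNAME    lance-env.elasticbeanstalk.com.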

    Read the article

  • [News] A helping hand for the Belgian .NET community

    A small boost for a newcomer to the .NET community: the DotNetHub site. It is a .NET site aimed mainly at French-speaking Belgium, Luxembourg, France and Switzerland. According to Steve Degosserie, co-creator of the site: "DotNetHub is a place where you will be able to find a whole range of information through news, blogging, articles, conference materials and podcasts." Long live DotNetHub!

    Read the article

  • Why is Google still not indexing my #! website?

    - by Zubair
    I have been working on a website which uses #! (2minutecv.com), but even after 6 weeks of the site being up and running and conforming to the Google hashbang guidelines stated here, you can still see that Google hasn't indexed the site yet. For example, if you use Google to search for "2MinuteCV.com benefits" it does not find this page, which is referenced from the homepage. Can anyone tell me why Google isn't indexing this website? Update: Thanks for all the help with this answer. So, just to make sure I understand what is wrong: according to the answers, Google never actually indexes the pages after the JavaScript has run. I need to create a "shadow site" which Google indexes (which Google calls HTML snapshots). If I am right in thinking this, then I can pick a winner for the bounty.
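
    For context, under the AJAX crawling scheme the question refers to (since deprecated by Google), a #! URL is rewritten by the crawler into an _escaped_fragment_ request, and the server is expected to answer that request with a pre-rendered HTML snapshot of the page. A rough sketch of the mapping, using a made-up URL for illustration:

        Pretty URL requested by users:    http://example.com/#!page=benefits
        URL the crawler fetches instead:  http://example.com/?_escaped_fragment_=page=benefits
                                          (the server should return fully rendered HTML here)

        <!-- for pages without #! in the URL, this meta tag opts the page into the same scheme -->
        <meta name="fragment" content="!">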

    Read the article

  • What's the difference between my nameserver and CNAME settings?

    - by Josh Mcquiston
    I have purchased a domain name (mxsoup.net) through GoDaddy, and it is just parked. In order to set up my custom URL for my SourceAudio site, they give me the following instructions: "In order to host your site at your own URL, we need you to set up some DNS records to point your URL to us. Specifically, we need two CNAME references, one for 'www.mxsoup.net' and one for 'secure.mxsoup.net', both of which should point to 'web2.sourceaudio.com'." But the rep on the phone at GoDaddy said that my site is hosted at HostMonster.com, and therefore I need to talk to them to accomplish this (which is possibly true, but the business owner says he hasn't purchased hosting for this particular domain, yet he does have some other sites in his HostMonster hosting account). My GoDaddy account shows that my nameservers are pointing at NS1.HOSTMONSTER.COM and NS2.HOSTMONSTER.COM, and I can edit those. But is this the same as setting up the CNAMEs as described above? Any help would be greatly appreciated!
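
    For what it's worth, the two settings are different things: the nameserver (NS) entries at GoDaddy only say which provider answers DNS queries for mxsoup.net, while the CNAME records have to be created in whatever zone those nameservers actually serve (HostMonster's DNS zone, in this case). Based purely on the hostnames quoted in the question, the records SourceAudio asks for would look roughly like this in zone-file form:

        ; records to add wherever the mxsoup.net zone is actually managed
        www      IN    CNAME    web2.sourceaudio.com.
        secure   IN    CNAME    web2.sourceaudio.com.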

    Read the article

  • Guest blogging, reciprocal links & nofollow

    - by sam
    When I write a guest blog post for a site and link back to myself in that post, that counts as an inbound link. If I then write something on my own blog like "have a look at this post I wrote for _" and make that a link, the links would be reciprocal, correct? Thus cancelling each other out. If I were to make the link back to my article a nofollow link, would I still get the link juice? And if I write a guest blog post and the site later wants to write a guest post on my site, what's the best way to handle that, since won't those links also cancel each other out and have no effect?
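
    For reference, marking a link as nofollow is done with the rel attribute on the anchor tag; the URL below is just a placeholder:

        <!-- a link that asks search engines not to pass ranking credit -->
        <a href="http://example.com/my-guest-post" rel="nofollow">have a look at this post I wrote</a>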

    Read the article

  • AdSense link units targeting better keywords?

    - by user1010609
    I have one link unit above my navigation which targets the exact keywords my site is about; by contrast, my AdSense for Content leaderboard and wide skyscraper show ads that are totally unrelated to my keywords. While my CTR increased about tenfold with the link unit, I noticed the CPC is lower as well (but that was based on just one day; today it is higher). Is it such a bad idea to completely get rid of the standard content ads and use link units instead? At the moment I earn very little from that site, and I think that is because of smart pricing (accidental clicks on ads my users are not interested in). In a whole day, with about 25-35 clicks, I earn as much as the keyword tool shows for a single click, and my site ranks first for its topic. I really don't know what to do. Has anyone had a similar situation, or can anyone give some advice?

    Read the article

  • Joomla Backend running slow on localhost

    - by boothe
    I made a local backup of a Joomla site a few months ago to test changes before updating the live site. Everything worked fine. Today I checked the local version again after a while, but when I open the backend (/administrator) it takes a long time until the page loads. I tried various things and accidentally disconnected my network connection; after that, everything loaded as fast as before. But when I reconnect the network connection, the problem reappears. I am running Joomla 1.5.14 on XAMPP 1.7.0.

    Read the article

  • Most effective marketing strategy to promote a casual iOS game?

    - by user1114968
    So I posted this on another forum yesterday, but that forum got suspended for malware, so I have to wait for the webmaster to fix the site. Here are the basics:

    - We released a press release through PRMac that included a video review.
    - We submitted to and followed up with all the big iOS review sites. None of them replied with interest; a lot of them just told me that their editors are volunteers who review games that are "interesting to their readers" and that they would take my app "into consideration".
    - The only site that reviewed our app and promoted it virally was iPhoneAppReview.com, which we paid.
    - We promoted on the top iOS forums.
    - We are now doing in-app advertising through inMobi and are integrating the SDK code into our app to start doing Tapjoy.
    - We posted our gameplay videos on YouTube.

    Are there any marketing strategies that anyone can suggest or recommend that we haven't used yet? If anyone wants to try out our game and give feedback on the game or the site or anything, that would be great! Our target countries are Japan, China, and the US.

    Read the article

  • How to make content display on its proper pages in search engines

    - by Dendory
    So my site has a blog and a gallery, both working the same way: there's an index, and each post has a permalink going to the individual entry. However, if I search for some of the content on Google, it often returns a link to the index, just because the content happened to be on the first page when it was crawled, instead of the individual post pages. This is especially true in the case of images. How can I make it so that Google returns the proper pages for the posts instead of just the main page of my site? My whole site is custom PHP code I wrote.
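
    One commonly suggested piece of this puzzle is a rel="canonical" link in the head of each permalink page, so the permalink (rather than the rotating index page) is treated as the preferred URL for that content. A hypothetical example, with a made-up permalink URL since the site's actual URL scheme isn't given:

        <!-- in the <head> of the individual post page generated by the custom PHP -->
        <link rel="canonical" href="http://example.com/blog/post.php?id=123">

    Submitting an XML sitemap that lists the permalink URLs is another common way to nudge the crawler toward the individual pages rather than the index.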

    Read the article

  • htaccess 301 redirect for payment page

    - by Chris Robinson
    I have a client who currently runs a venue and makes ticket purchases available through a third party. The way the site currently works is that there is a standard link in the nav menu to the ticket purchasing site:

        <a href="http://example.com/events">Events</a>
        <a href="http://example.com/about">About</a>
        <a href="https://someticketvendor.com/myclient?blah">Tickets</a>

    They claim that they want to improve their SEO by appearing to integrate the ticket pages into their site. Having spoken to the ticket vendor, they only offer integration through iframes, which is just horrible. I don't really know much about SEO, but I'm wondering if I can create an .htaccess rule to have http://example.com/tickets forward to https://someticketvendor.com/myclient?blah. Are there any negative SEO implications to doing this? Is there a better way this could be done?
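
    Mechanically, the forward itself is a one-liner in .htaccess; the sketch below uses the example hostnames from the question. Note that a redirect like this still sends visitors (and search engines) to the vendor's domain, so by itself it does not make the ticket pages look like part of the client's site:

        # .htaccess in the site root (mod_rewrite assumed to be enabled)
        RewriteEngine On
        # send /tickets to the third-party ticket vendor;
        # switch R=302 to R=301 once the behaviour is confirmed
        RewriteRule ^tickets/?$ https://someticketvendor.com/myclient?blah [R=302,L]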

    Read the article

  • robots.txt, how effective is it and how long does it take?

    - by Stefan
    We recently updated the site to a single-page site using jQuery to slide between "pages", so we now have only index.php. When you search for the company on engines such as Google, you get the site and a listing of its sub-pages, which now lead to outdated pages. Our hosting plan doesn't allow us to edit the .htaccess, and the old pages are .html docs, so I cannot use PHP redirects either. So if I put in place a robots.txt telling the engines not to crawl beyond index.php, how effective will this be in preventing/removing the crawled sub-pages? And, as a rough guess, how long before the search engines would update?
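
    A robots.txt along these lines would express the "nothing but index.php" rule; the * and $ patterns are understood by Google's crawler, though robots.txt only stops future crawling and does not by itself remove pages that are already indexed. The file below is a sketch under those assumptions:

        # robots.txt in the site root
        User-agent: *
        # block the old static .html pages that no longer exist as real content
        Disallow: /*.html$
        # everything else (including index.php) stays crawlable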

    Read the article

  • Images Loading Very Slowly

    - by Vecta
    I'm currently working on optimizing my site to decrease load time, using Pingdom Tools. I seem to be having some difficulty with long load times on images. For example, the body background for my site is a 29 kB file but takes almost 500 ms to load, the majority of which is spent connecting to the server. This one seems to take the longest, but other images take a lot of time as well, with most of it again spent connecting to the server. This also fluctuates: I've seen the same image load in 500 ms one minute and in 1.5 seconds ten minutes later. My site uses the MODX CMS, but I'm not sure whether that would affect this at all. Is it more likely that this is a server issue? Is there anything I should check or do to help alleviate these inflated 'connect' times?

    Read the article

  • I've changed my URL schema. How do I tell Google to index the new schema and forget the old one?

    - by growse
    I had a site where the URLs were constructed like this:

        /index.php/Topic
        /index.php/AnotherTopic

    These were indexed in Google, and search results were returned that pointed to them. However, I've recently replatformed the site and reconfigured it so the above URLs become:

        /index.php?title=Topic
        /index.php?title=AnotherTopic

    The original URLs are returning 404s. The site links to the correct URL schema internally, but Google is retaining the original schema in its search results. I've updated and resubmitted the sitemap, which only contains the new schema. Also, Google Webmaster Tools is going slightly bananas at the fact that there's now a spike in 404 errors in its crawl results. What would be the best approach to get Google to 'forget' about the old schema and index the new schema instead? Should I try blocking /index.php/ in robots.txt? Should I be returning 301 codes instead of 404s for the original URLs?
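
    If the answer turns out to be 301s, the mapping from the old path-style URLs to the new query-string form is easy to express with mod_rewrite. This is just a sketch, assuming Apache and that the topic name maps one-to-one onto the title parameter:

        # .htaccess - permanently redirect /index.php/Topic to /index.php?title=Topic
        RewriteEngine On
        RewriteRule ^index\.php/(.+)$ /index.php?title=$1 [R=301,L]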

    Read the article

  • Issues With IIS Hosting Two Domains From Same Folder [closed]

    - by Bob Mc
    I have two different domain names that resolve to the same ASP.NET site. Both domains are hosted on the same server, which runs Windows Server 2003 and IIS 6. The sites are differentiated in IIS Manager using host headers; however, both sites point to the same folder on the local drive for the site's page files. I am occasionally seeing an ASP.NET error that says "The state information is invalid for this page and might be corrupted." I'm the site developer, so I've addressed all the relevant code-related causes for this issue. However, I was wondering whether having two domains/sites share the same folder for an ASP.NET application might be causing this intermittent error. Also, is this generally a bad practice? Should I make separate, duplicate folders for each of the domains? It seems like that could become a maintenance headache.

    Read the article

  • Abandonment to blame?

    - by Larsenal
    I have a code snippet for an app that users are loading as a third-party script on their site. The general sequence is as follows:

    - The site loads "http://www.example.com/foo.js"
    - foo.js does stuff
    - 1 to 2 seconds later, foo.js loads bar.js

    Now in a perfect world, I'd want to see matching counts for the calls to foo.js and bar.js. However, bar.js loads only about 94% of the time. I'm wondering how much of this discrepancy might be attributable to site abandonment, given that bar.js is delayed by 1 or 2 seconds. I posted here instead of Stack Overflow since I think it's more a question about what the typical time on page is when users abandon a page.
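
    For readers picturing the setup, the delayed second load is typically done by injecting a script element from a timer inside foo.js. A minimal sketch of that pattern (the file names are the ones from the question; everything else is assumed):

        // inside foo.js - do the initial work, then pull in bar.js after a short delay
        setTimeout(function () {
            var s = document.createElement('script');
            s.src = 'http://www.example.com/bar.js';
            s.async = true;
            document.getElementsByTagName('head')[0].appendChild(s);
        }, 1500); // 1.5 s delay; any user who leaves before this fires never requests bar.js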

    Read the article

  • MammothVPS launches Backups, DNS Management and more!!!

    - by stefan.sedich
    Yesterday we launched a bunch of new features over at MammothVPS:

    - All VPSes now have an on-site, off-server backup facility available to them. By default, all customers have 1 free weekly backup, and should you wish to, you can purchase more slots, available on daily, weekly and monthly schedules.
    - DNS hosting has been made available and is free for all customers. You can find the new interface in your mPanel.
    - The menu system has been cleaned up to make it easier to navigate around both the site and mPanel.
    - You will find new sections on the site; we now have more information about our services and have included things like a Knowledge Base, which provides information on how to set up various applications on your VPS.
    - We have added the ability to change the kernel your VPS is running on.

    So head on over to MammothVPS and check it out.

    Read the article
