Search Results

Search found 25629 results on 1026 pages for 'site maintenance'.


  • Permanent redirect to different domain followed by temporary redirect to folder

    - by Ricardo Amaral
    I have old-domain.com, which I want to migrate to new-domain.com. However, the content on the old domain is, well, old, and I'm currently in the process of redesigning my whole site. My idea is to do a permanent (301) redirect from old-domain.com to new-domain.com so that search engines learn about the new domain and forget the old one. But since the content is old, I was thinking of doing a temporary (302) redirect from new-domain.com to new-domain.com/old/ until the new content/site is ready to be published. Is this, for some reason, a bad idea? Or is there nothing wrong with it? One last thing: if I go with this, what should I do when the new content is ready? Should I just remove the 302 redirect and that's it, or should I do something else to notify search engines that the temporary redirect is over?
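
    For illustration, a minimal .htaccess sketch of the two redirects described, assuming Apache with mod_rewrite enabled (domains are placeholders from the question):

        # On old-domain.com: permanently redirect everything to the new domain
        RewriteEngine On
        RewriteRule ^(.*)$ http://new-domain.com/$1 [R=301,L]

        # On new-domain.com: temporarily send the root to /old/ until the redesign ships
        RewriteEngine On
        RewriteRule ^$ /old/ [R=302,L]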

    Read the article

  • Distributed Transaction Framework across webservices

    - by John Petrak
    I am designing a new system that has one central web service and several site web services which are spread across the country, with some overseas. Some data must be identical on all sites, so my plan is to maintain that data in the central web service and then "sync" it out to the sites. This includes inserts, edits, and deletes. I see a problem when deleting: if one site has used the record, then I need to undo the delete that has already happened on the other servers. This led me to the idea that I need some sort of transaction system that can work across different web servers. Before I design one from scratch, I would like to know if anyone has come across this sort of problem, and whether there are any frameworks or even design patterns that might aid me.
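
    For what it's worth, a rough sketch of the compensating-transaction ("saga") pattern often used in place of true distributed transactions across web services; every name here is hypothetical, and the real apply/compensate callables would wrap each site's delete and undelete endpoints:

        from dataclasses import dataclass
        from typing import Callable, List

        @dataclass
        class SiteOp:
            apply: Callable[[], None]       # e.g. call a site's delete endpoint
            compensate: Callable[[], None]  # e.g. call the same site's undelete endpoint

        def run_saga(ops: List[SiteOp]) -> bool:
            """Apply each site's operation; on any failure, undo the ones already done."""
            done: List[SiteOp] = []
            try:
                for op in ops:
                    op.apply()              # a site raises if the record is still in use
                    done.append(op)
                return True
            except Exception:
                for op in reversed(done):
                    op.compensate()         # roll back the delete on sites that accepted it
                return False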

    Read the article

  • Set IP address to point to a certain domain

    - by silvercover
    I have a Linux VPS with DirectAdmin as the web panel, and I have already pointed a domain at it. Everything is OK and I can see my website in my browser using the domain name. Now I need access to my site using its IP address, something like http://86.57.88.29, but when I try to load the site in a browser by IP I get the message below, and I have to suffix the IP with /~admin (http://86.57.88.29/~admin) to get it to work.

    This IP is being shared among many domains. To view the domain you are looking for, simply enter the domain name in the location bar of your web browser.

    So how can I configure the IP to point to my public_html folder without any /~admin-like suffix? Thanks.
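
    For illustration, a minimal Apache virtual host sketch that would make the bare IP serve the same document root as the domain. The paths and domain are assumptions, and on DirectAdmin this would normally be done through the panel's custom httpd configuration rather than by editing files by hand:

        <VirtualHost 86.57.88.29:80>
            ServerName 86.57.88.29
            DocumentRoot /home/admin/domains/example.com/public_html
        </VirtualHost>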

    Read the article

  • How can I work on a WordPress theme already installed in the root directory?

    - by Isaac Lubow
    I have WordPress installed at the root level of a website. I thought it would be easy enough to have a "coming soon" page called default.html and edit the .htaccess file as follows:

        AddHandler php5-script .php
        DirectoryIndex default.html index.php

        # BEGIN WordPress
        # END WordPress

    ...so that visitors to the site are sent to the default page, and I could manually specify index.php as my destination for testing. (This isn't a high-security job.) But index.php is redirecting me to the default page. When I remove the DirectoryIndex line, the index.php file is found automatically by visitors to the site root, but... that's the page I was trying to hide. What am I doing wrong with .htaccess, and how can I get it to behave the way I want?
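
    For reference, WordPress's stock rewrite block, in case it was stripped from the excerpt above (the empty BEGIN/END markers suggest it may have been); the DirectoryIndex line reflects the ordering the question describes:

        AddHandler php5-script .php
        DirectoryIndex default.html index.php

        # BEGIN WordPress
        <IfModule mod_rewrite.c>
        RewriteEngine On
        RewriteBase /
        RewriteRule ^index\.php$ - [L]
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule . /index.php [L]
        </IfModule>
        # END WordPress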

    Read the article

  • How to Estimate Needed Bandwidth for New Web Application?

    - by Noah Goodrich
    I am working on a brand new SaaS web application and need to estimate the initial bandwidth usage. Since the site doesn't exist yet, and since this is my first endeavor of this sort, I'm not really sure how much bandwidth to estimate to begin with. We will be using Linux, Apache, PHP and MySQL, and the content will be generated dynamically. There will be images as part of the site design, but users will also upload images to be displayed and documents to be stored for later download. We'd like to be able to support 500,000 page loads per month, with estimated image loads being about two to three times that.
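
    A back-of-envelope sketch of the arithmetic, with page and image sizes as pure assumptions rather than measurements:

        # All sizes are assumptions, not measurements.
        page_loads = 500_000            # per month, from the question
        image_loads = 3 * page_loads    # upper end of "two to three times"
        avg_page_kb = 30                # generated HTML, assumed
        avg_image_kb = 100              # assumed average image weight

        monthly_gb = (page_loads * avg_page_kb + image_loads * avg_image_kb) / 1024 / 1024
        print(f"~{monthly_gb:.0f} GB/month")  # about 157 GB/month under these assumptions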

    Read the article

  • Do Not Track Plus Stops Web Sites from Tracking You

    - by Jason Fitzpatrick
    Do Not Track Plus is a Firefox extension that combines the do-not-track header with protection lists for comprehensive tracking avoidance while surfing the web. Unlike all-or-nothing no-tracking flags, the Do Not Track Plus extension for Firefox allows you to set white and black lists for websites you would prefer to be tracked or not tracked by. You may, for example, want a shopping site you get benefits from, or a news site that gives you customized articles, to be allowed to track you. The tool also preserves anti-tracking cookies even when you wipe the rest of the cookies in your browser's cache, effectively stopping you from accidentally rescinding your opt-out cookies from anti-tracking sites. Do Not Track Plus [Abine via Wired]

    Read the article

  • How to dump a MediaWiki for offline use?

    - by Sandra Schlichting
    I would like to be able to make an offline version of a MediaWiki site on a weekly basis. The DumpHTML extension actually does what I want, since it dumps all articles and media files, but I can't see any index of the articles it has dumped, so I can't navigate the dump. Reading about MediaWiki's XML dump feature, I wonder whether it would be possible to either view these files with a program or convert them to HTML. Or are there other ways to make an offline version of a MediaWiki site?
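
    Two common approaches, sketched, assuming shell access to the wiki host (the wget URL is a placeholder):

        # Full XML dump via MediaWiki's bundled maintenance script
        php maintenance/dumpBackup.php --full > wiki-dump.xml

        # Or mirror the rendered pages for direct offline browsing
        wget --mirror --convert-links --page-requisites --no-parent http://example.com/wiki/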

    Read the article

  • How much is a website like bytes.com earning?

    - by robin das
    I am running a website similar to bytes.com (an IT Q&A site); my site attracts 600 unique visitors daily, and we are modeling ourselves on bytes.com. Can anyone at least tell me whether we can earn some serious money with this kind of website? Websiteoutlook estimates bytes.com's daily pageviews at 700,636, and it uses Google ads. Can anyone please let me know what kind of earnings we can expect for a dotcom like bytes.com? Please shed some light on this topic, as a lot of energy, time and money goes into building this kind of website. Thanks, robin Das

    Read the article

  • How to monitor outgoing server activity to detect malware?

    - by ted.strauss
    I have a website that has previously been a victim of malware. I restored the site from an old backup and have made every effort to lock down the server. I have no way to be absolutely certain that the backup I used is clean, and I'm worried that this malware may reappear. I would like to use a tool that monitors outgoing port activity to detect signs of malware. Unfortunately I'm using a server host that does not give me shell access, so I need a tool that can be installed via FTP and used via the browser. My site is Joomla :( so a Joomla extension with this capability would work, but I haven't found one yet. Any suggestions? Many thanks

    Read the article

  • MySQL port forwarding

    - by Eduard Luca
    I am trying to help a colleague connect to my MySQL server, but the situation is a bit special; here's why (let's call him person A and me person B): Person A has a PC, on which he runs a virtual machine that is in the same network as the PC itself. Person A is also in the same network as person B (a different network from the VM's). I want the site that lives on A's VM to be able to connect to the MySQL server on B's PC. For this I thought port forwarding would be appropriate: from ip-of-person-A:3306 to ip-of-person-B:3306. This way the site would connect to the IP of the PC it lives on (not the VM), which would forward to B's MySQL. I've seen several examples of port forwarding, but I don't think they're what I need; from what I've seen, they're kind of the opposite. So would something like this be achievable?
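
    A sketch of the forward described above. If person A's PC runs Windows, netsh portproxy can do it; on Linux, an iptables DNAT rule is the equivalent (IPs are placeholders):

        rem Windows, run on person A's PC
        netsh interface portproxy add v4tov4 listenport=3306 listenaddress=0.0.0.0 connectport=3306 connectaddress=<ip-of-person-B>

        # Linux equivalent
        iptables -t nat -A PREROUTING -p tcp --dport 3306 -j DNAT --to-destination <ip-of-person-B>:3306
        iptables -t nat -A POSTROUTING -j MASQUERADE

    Note that B's MySQL must also listen on its network interface (bind-address) and grant the connecting user access from A's IP.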

    Read the article

  • How to create a SharePoint 2013 workflow using Visual Studio

    - by ybbest
    If you would like to use Visual Studio to create a workflow in SharePoint 2013, here are the steps to get started.

    1. Create a SharePoint sandbox solution.
    2. Add a list workflow.
    3. Add a WriteToHistory activity to the workflow.
    4. Here is what the final solution looks like:
    5. Deploy the sandbox solution to your Office 365 Preview and activate the site collection feature first.
    6. Then activate the site features in the following order.
    7. You can run your workflow as shown below.
    8. Navigate to your workflow history list; you will see that the workflow completed successfully.

    You can download the solution here.

    Read the article

  • Current iOS version/device statistics?

    - by hotpaw2
    The answer to the SO question "iOS version/device statistics - where can I find?" has become stale, because currency wasn't part of that question and iOS version updates have been released since it was asked. Is there a web site or other publicly available source that keeps a current or frequently updated list of the percentages of iOS devices and OS versions in use, perhaps by continually monitoring app analytics, web site logs, or other means? And what device or OS information are iOS app analytics currently allowed to report, if any? (...assuming an appropriate privacy policy and adherence to it, of course.)

    Read the article

  • I want to consolidate two sites into a third. Will my search engine rankings be penalized if I rewrite and redirect pages one by one?

    - by Patrick Kenny
    I have two Drupal sites with different content-- let's call them Apple and Orange. I recently developed a much more sophisticated third Drupal site-- let's call it Tree. For a large number of reasons, the content on Apple and Orange is useful for the users of Tree, so I want to move the content to Tree. However, much of the content is out of date. (This whole process took about five years.) To update the content, I will rewrite it one article at a time myself. Now here's my question: if I move the articles one by one (as I rewrite them) and then redirect the old articles (using a 301 redirect) on Apple/Orange to the new site on Tree, will this have a huge negative effect on my search engine rankings? Is there a good way to redirect among sites when they merge like this, or would I be better off keeping the old articles on Apple/Orange and simply linking them to the new, rewritten articles on Tree?
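
    For illustration, the per-article redirect on Apple/Orange as each rewrite goes live can be a single Apache directive per article (paths and domain are hypothetical):

        Redirect 301 /old-article-path http://tree-site.com/new-article-path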

    Read the article

  • jQuery with SharePoint solutions

    - by KunaalKapoor
    For me, jQuery is the 'Plan B' for everything, and most of my projects include jQuery for something or other, so I decided to write a small note on what works best when using jQuery along with SharePoint. I prefer the jQuery JavaScript library, which is far more robust, easier to use, and allows for plugins. Follow the steps below to add jQuery to your master page. For Office 365, the preferred location for jQuery files is the "Site Assets" library.

    Deployment Best Practices

    They are only as good as the context they are referenced in. In other words, take your world into account before applying them.

    - Script your deployment options: a folder in SPD, the file system, or external references. The jQuery library is on the Microsoft Ajax Content Delivery Network. You may even choose to publish to and from a document library (there are pros and cons to this approach).
    - Reference options when referencing the script:
      - ScriptLink will make sure it is loaded at the top of the page and only loaded once. You need Visual Studio or SPD.
      - Content Editor Web Part (CEWP): drop it on the page and it's there. Easy but dangerous.
      - Custom Actions: great for global deployments of jQuery. Loads it on every page, and it also works in sandbox installations.

    Deployment Maintenance Don'ts

    - Don't add scripts directly to your master page. That's way too much effort because the pages are hard to maintain.
    - Don't add scripts directly to the CEWP. Use a content link instead; that allows for reuse, and if you or someone else deletes the CEWP you won't lose the code in the web part.
    - Security: any script runs with the privileges of the current user. In other words, you can't get in trouble.

    Development Best Practices

    - Don't abuse the DOM. There are better options to load the DOM without hitting it 1,000 times.
    - Use other performance boosters.
    - Try other libraries. Try some custom code.
    - Avoid string conversion.
    - Minify your files.
    - Use CAML to reduce the number of returned rows.
    - Only update your jQuery library AFTER RIGOROUS REGRESSION TESTING.
    - CRUD operations can come with some fun. SPServices wraps SharePoint's web services for execution.
    - The Bing SDK is pretty easy to use. You can add it to your page with a script, put it into a Content Editor Web Part, and connect it from the address parameters in a list.

    Steps:

    1. Go to jquery.com and download the latest jQuery library to your desktop. You want the compressed production version, not the development version.

    2. Open SharePoint Designer (SPD) and connect to the root level of your site's site collection. In SPD, open the "Style Library" folder and create a folder named "Scripts" inside it. Drag the jQuery library JavaScript file from your desktop into the Scripts folder. In the Scripts folder, create a new JavaScript file and name it (e.g. "actions.js").

    3. If you are using Visual Studio, add a folder for the js files. You can create a new folder at the root level or, if you prefer cleaner solutions like me, use the layouts folder, which cleans out on deactivation/uninstall.

    4. Within the <head> tag of the master page, add a script reference to the jQuery library just above the content placeholder named "PlaceHolderAdditionalPageHead" (and above your custom CSS references, if applicable), as follows:

        <script src="/Style%20Library/Scripts/{jquery library file}.js" type="text/javascript"></script>

    Immediately after the jQuery library reference, add a script reference to your custom scripts file as follows:

        <script src="/Style%20Library/Scripts/actions.js" type="text/javascript"></script>

    Inside your script tag, you can test whether jQuery is already defined and, if not, add it to the page:

        <script type='text/javascript'>
          if (typeof jQuery == 'undefined')
            document.write('<scr'+'ipt type="text/javascript" src="http://code.jquery.com/jquery-1.6.1.min.js"></sc'+'ript>');
        </script>

    For the inquisitive few... read on if you'd like :)

    Why jQuery on SharePoint is Awesome

    It's all about that visual wow factor. You can get past that "but it looks like SharePoint" reaction: take a long list view, put it into jQuery with pagination, etc., and you are the hero. It's also about new controls you get with jQuery that you couldn't build before.

    Why jQuery with SharePoint should be Awful

    Although it's fairly easy to get jQuery up and running, copy/paste can cause problems. If you don't understand what it's doing in the Client Object Model and the Document Object Model, it will do things on your site that are completely unexpected. Many blogs note workarounds they employed on their sites.

    - Why it's not working: debugging "sucks". You need to develop small blocks of functionality and test them by putting in some alerts and console.log calls. Set breakpoints and monitor the DOM via Firebug and the IE development tools.
    - Performance: it happens all the time, but you should look at the tradeoffs. More time may give you more functionality.
    - Consistency: "But it works fine on my computer." So test on many browsers, and take client resources into account.
    - Harm the farm: code wisely and do negative testing. Don't be the cause of a DoS attack that's really jQuery asking for a resource over and over and over again. Monitor server resources. (In one demo, jQuery pulled data from a list in an endless loop: a poor decision but an easy mistake. It spiked the server's resources within a couple of seconds and the call had to be shut down before it brought the server down.)

    Conclusion

    jQuery is now another tool in your toolkit. You don't have to use it; use it where it makes sense and where it helps you get your job done. Don't abuse it, or you will pay for it later. It will add to page bloat, so take that into account, and it can slow your performance.

    Read the article

  • Silverlight: Creating great UIs

    - by xamlnotes
    I was always told I was left-brained and could not draw, and I bought into that view. Somewhere down the road, years ago, I did learn to play guitar, and to play by ear at that. Now that's not all left-brained, so my right brain must be working. About a year ago, my good friend Billy Hollis turned me on to a book by Betty Edwards (http://www.drawright.com/). I started reading it, and soon I found myself drawing on napkins in restaurants while we were waiting on food, and at many other times too. Dang'd if I could not draw! Check out my UI article at Dev Pro Connections (Great UIs article) on some of my experiences. Here are a few more links that are really cool too:

    - Cool color combinations web site
    - Simply Painting is awesome. Saw this guy on TV.
    - This site has some great tools for color contrasting

    Read the article

  • List of eCommerce sites that use end-to-end SSL?

    - by Jon Schneider
    My development team is considering implementing an eCommerce site using end-to-end SSL -- that is, every page on the site is accessed via an https:// URL -- rather than the more traditional "mixed mode" where most pages are accessed via http:// and only "secure" pages such as login and credit card entry are redirected to https://. Pros of doing such a "pure SSL" approach include avoidance of some session-hijacking attacks such as Firesheep; cons include performance considerations. My question is: Is anyone aware of a list of eCommerce websites (especially USA-based sites), or even specific websites, that use this end-to-end SSL approach? I'm especially interested in "regular" eCommerce sites rather than banks or other "financial" sites.
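
    For reference, a minimal sketch of what "pure SSL" means in Apache terms: every plain-HTTP request is permanently redirected to its https:// counterpart, leaving no mixed mode (assuming mod_rewrite; this is an illustration, not any particular site's configuration):

        RewriteEngine On
        RewriteCond %{HTTPS} off
        RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]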

    Read the article

  • Stopping duplicate H1 and title from dynamic content

    - by codemonkey
    I have a web site with lots of dynamically created (database-driven) pages. These pages are basically used to show uploaded images, and they look a bit like this:

    URL: http://www.mywebsite.com/page-id/page-title/
    H1: View from the sea

    This is a big issue because I might have 10 other pages with the title 'View from the sea'. I know the simple solution would be to make sure the pages are named differently, but I have lots of users on the web site, so it's not that simple. What do you guys think of putting the page-id with the page-title in the H1 tag, so it might read "437 - View from the sea"? I need to differentiate the H1 titles. I think using the page-id would help, but if anyone has a better solution that would be great! Thanks in advance

    Read the article

  • Forum vs Q&A system

    - by danie7L T
    I would like to know what parameters I have to take into consideration before deciding whether I should add a "Q&A system" or a full forum to a website. I think forums allow better search capabilities (you can easily dig out old posts) than a "Q&A system", but the latter offers simpler/faster interaction between the users and the site owners. I should add that only a few people (site owners plus authorized people) could answer the questions; other users will be on a read-only basis. Can anyone help me decide between the two solutions? Thank you in advance. NB: There is also the impact on SEO; is it the same for forums and Q&A systems?

    Read the article

  • Why is Google not indexing my forum?

    - by jsoldi
    I have this web site which, as you can see at the top right, has a "Forums" button that links to my forums. My problem is that if, for instance, I do the Google search "site:heliumscraper.com answers", I don't get any results, even though the word "answers" is in the forum's board index and has been there for a few months already. I don't have a robots file. I also uploaded a sitemap to Google that contained the forum. I'm thinking the problem might be that the only link from my main page to the forum opens in a new window. Could this be the problem?
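
    For what it's worth, a minimal robots.txt sketch that explicitly allows crawling and points at the uploaded sitemap (the sitemap path is an assumption):

        User-agent: *
        Allow: /

        Sitemap: http://www.heliumscraper.com/sitemap.xml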

    Read the article

  • Mail Scanning System

    - by Mr D
    In the same way Gmail can generate ads based on email content, I am looking for a way to develop a system which can:

    - Allow users to connect their email address to our site
    - Continuously monitor all incoming emails
    - Check incoming emails against criteria (e.g. a certain address or subject) and save any email that matches to a database
    - Once a new email has been found, send the user a notification telling them to log back into the site to see it

    My questions are:

    - Would this be possible?
    - What would be a good language to use (generally I like PHP, Python and Java)?
    - Are there any frameworks which would help do this?
    - How would I connect to the users' email accounts to access their emails (do I need a mail server?)

    Any advice? Thank you! If you need more information please let me know.
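
    A minimal sketch of the polling side in Python, using the standard library's imaplib. The server, credentials, and subject filter are placeholder assumptions; no mail server of your own is needed if users grant IMAP access to their accounts:

        import imaplib
        import email

        # Placeholder credentials; in practice these come from the connected user account.
        conn = imaplib.IMAP4_SSL("imap.example.com")
        conn.login("user@example.com", "app-password")
        conn.select("INBOX")

        # Search unseen messages whose subject matches the site's criteria.
        status, data = conn.search(None, '(UNSEEN SUBJECT "invoice")')
        for num in data[0].split():
            status, msg_data = conn.fetch(num, "(RFC822)")
            msg = email.message_from_bytes(msg_data[0][1])
            print(msg["From"], msg["Subject"])  # here: save to the database, queue a notification
        conn.logout()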

    Read the article

  • 302 Redirect causes garbage at end of Wordpress link in Facebook

    - by Joao
    When I try to link my Wordpress blog on Facebook, the URL doesn't resolve properly: there's garbage appended at the end, and Facebook is not able to retrieve information from the site. This happens for every page, post, or main entry. Here's what happens: http://clarissarezende.com.br/ shows up in Facebook as http://clarissarezende.com.br/UPLcS/ (when I copy/paste the link), and no information about the site shows up in FB. I'm using Wordpress 3.3.1 with ProPhoto 4. Recently I moved the DNS entry at my ISP: the blog is hosted at clarissarezende.com.br/public_html/blog2, and the DNS used to point to public_html; I changed it to point to public_html/blog2. Note that I did not move any Wordpress files. I made the (I think) necessary changes all over Facebook, but still no dice... Any ideas on what could be happening?

    Read the article

  • Google Bot trying to access my web app's sitemap

    - by geekrutherford
    Interesting find today...

    I was perusing the event log on our web server today for any unexpected ASP.NET exceptions/errors. Found the following:

        Exception information:
            Exception type: HttpException
            Exception message: Path '/builder/builder.sitemap' is forbidden.
        Request information:
            Request URL: https://www.bondwave.com:443/builder/builder.sitemap
            Request path: /builder/builder.sitemap
            User host address: 66.249.71.247
            User:
            Is authenticated: False
            Authentication Type:
            Thread account name: NT AUTHORITY\NETWORK SERVICE

    At first I thought this was maybe an attempt by a hacker to mess with the sitemap. Using a handy web site (www.network-tools.com) I did a lookup on the IP address and found it was a Google bot trying to crawl the application. In this case, I would expect an exception or 403 since the site requires authentication anyway.

    Read the article

  • How to prevent Google users from finding my admin page index?

    - by krish
    I am running a website, but some days ago I stopped it and put up an under-construction page, because the "Index of" listing of the admin page is visible to the outside world through Google search. One of my friends told me that my website's index is visible, and that from there it's one step away from accessing the password file; he showed me this very simply using Google search. How can I prevent this? I am hosting my site with a hosting company and reported this to them, but they simply replied that it's still secure and I need not worry. Do I really not need to worry, and can I continue running my site with the admin page index visible?
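
    Two common mitigations, sketched for an Apache host (the /admin/ path is a placeholder): disable auto-generated directory listings, and keep compliant crawlers away. Note that robots.txt is not a security control by itself; the directory should also require authentication.

        # .htaccess: turn off auto-generated "Index of" directory listings
        Options -Indexes

        # robots.txt: ask crawlers to skip the admin area
        User-agent: *
        Disallow: /admin/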

    Read the article

  • Windows 2008 R2, UDDI 3.0 and No Admin Links

    - by Andy Morrison
    Windows 2008 R2 might end up giving me a heart attack at some point. Yesterday I installed and configured UDDI 3.0 as part of an ESB 2.0 install and config. After configuring UDDI 3.0, if I browsed to the localhost/uddi virtual directory from IIS, all of the links would show up in UDDI. If I opened up IE and went to the UDDI site, only the Home and Search links would show up. You've probably already guessed what the "fix" was... I had to run IE as Administrator. Then when I browse to the UDDI site, all of the links show up.

    Read the article

  • Suggested ways of collecting thousands of links to mainstream media articles

    - by Matt
    I'm currently running a modified Wordpress site that is designed purely to publish links to other sites, similar to The Drudge Report. Right now I have a few dozen Google Alerts set up and go through each result manually; if it matches a few niche keywords I'm working with, I add a link to the article to my site. I do the manual checking because Google Alerts sometimes finds links to sites that belong to service providers, organizations, or products, but all I want are mainstream news articles. So my question: is there a more efficient, and ideally automated, way to perform such highly qualitative searches and aggregate the resulting links?
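
    One possible sketch: Google Alerts can deliver results as RSS feeds, which a small script can poll and filter before anything is posted. The feed URL and keywords are placeholders, and feedparser is a third-party package:

        import feedparser

        ALERT_FEED = "https://www.google.com/alerts/feeds/EXAMPLE/EXAMPLE"  # placeholder
        KEYWORDS = {"niche keyword", "another keyword"}

        feed = feedparser.parse(ALERT_FEED)
        for entry in feed.entries:
            text = (entry.title + " " + entry.get("summary", "")).lower()
            if any(k in text for k in KEYWORDS):
                print(entry.link)  # here: queue the link for review or post it to the site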

    Read the article
