Search Results

Search found 4618 results on 185 pages for 'websites'.

Page 133/185

  • Domain changes required for SSL integration

    - by user131003
    Currently my site supports regular payment options (the user is taken to the payment gateway's website). Now I'm trying to implement "seamless" PG integration, and I need SSL for this. I have a dedicated server with 5 static IPs from Hostgator (HG). Option one: I get SSL for www.my_domain.com. According to HG, I need to change the IP of the main site, as the current IP is not really dedicated (it is shared by cPanel etc.), so they need to bind another dedicated IP to the main domain for SSL to work. This would require a DNS change for the main website and hence cause a few hours of downtime (which is OK). I've noticed that most e-commerce websites use subdomains like secure.my_domain.com for SSL/HTTPS, which sounds like a better approach, but I have a few doubts in this case: a) Would I need to re-register with existing PGs (PayPal, Google Checkout, Authorize.net) if I switch to a subdomain? Re-registering is not an option for me. b) Would a DNS change be required for www.my_domain.com in this case? This confusion arose because of the following reply from HG: "If the sub domain secure.my_domain.com is added to an existing cPanel it will use the IP for that cPanel so as long as it is a Dedicated IP that will be fine. If secure.my_domain.com gets setup as its own cPanel it will need to be assigned to a Dedicated IP which would have a DNS change involved." Please advise.
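
    On doubt (b): if secure.my_domain.com is simply added to the existing cPanel account, only a new record for the subdomain is involved and www stays as it is. A minimal sketch of the zone entries, assuming the host binds a second dedicated IP to the subdomain (the IPs are placeholders, not HG's actual assignments):

        ; zone file for my_domain.com
        www     IN  A   203.0.113.10   ; main site, unchanged
        secure  IN  A   203.0.113.11   ; dedicated IP the SSL certificate is bound to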

    Read the article

  • PHP profiling on a production server, or other options

    - by absentx
    Alright, I need some help here. I am commonly asked to speed up certain sections of some websites that I program for, but I have yet to figure out how to use a good PHP diagnosis/profiling tool. Some things to consider: The sites I am working on are already built, and getting a testing server set up locally is just a huge pain: I have to rewrite include paths and so many other things. This is a results-oriented deal, and spending days getting a site fully working on a testing platform so I can debug one page probably isn't an option. I can write tons of PHP, but I have no clue how to interact with or manage servers. So every tutorial I read about setting up xdebug or xhprof seems to involve installing something on a production server that I don't have access to or have no clue how to work with. Are there any solutions out there that will show me where my PHP is slow without all sorts of server work that I just don't know how to do? Xhprof seems the closest to usable for me, but from what I can tell it still has to be installed on the server. If anyone can just point me in the right direction on this I would be very grateful. Maybe getting these things put on the server isn't a big deal, but I have never worked with server command lines or anything like that. I suppose I should start sometime, but I really have no idea where. I realize that profiling on a live platform is not the greatest idea either, but I feel I am in a tough spot: I have speed issues to solve, and setting up a local environment, while a great idea, just doesn't seem practical at the moment.
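
    Until a real profiler is an option, a few microtime() checkpoints dropped into the slow page on the live site can at least localize the problem without installing anything on the server. A minimal sketch (the section labels are made up):

        // crude, install-nothing timing: log how long each section takes
        $t0 = microtime(true);
        // ... suspect section 1, e.g. database queries ...
        $t1 = microtime(true);
        // ... suspect section 2, e.g. template rendering ...
        $t2 = microtime(true);
        error_log(sprintf('queries: %.4fs, rendering: %.4fs', $t1 - $t0, $t2 - $t1));

    This is no substitute for xdebug or xhprof, but it narrows down which includes or queries deserve attention first.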

    Read the article

  • looking for a short explanation of fuzzy logic

    - by user613326
    Well, I got the idea that the basics of fuzzy logic are not that hard to grasp, and I have the feeling that someone might explain it to me in about 30 minutes. Just like I understand neural networks and am able to re-create the famous XOR problem, and go beyond it and create 3-layer networks of x nodes, I'd like to understand fuzzy logic to a similarly useful level, in C#. The problem I face is that while I'd like to get the concepts right, many websites include lots of errors in their basic explanations. For example, some show pictures but then use different numbers than those shown in the pictures when calculating, as if lots of people just copied material without noticing what they wrote down, while others go too deep into their math notation. To me that's very annoying to learn from. There is no need to reinvent the wheel: AForge already has a fuzzy logic framework. So what I am looking for are some good examples, like how the neural XOR problem is solved. Is there such an instructional resource out there: a web page, or a YouTube video, where it is explained briefly? What would you recommend? Note that this article comes close, but it just doesn't nail it for me. I've also downloaded a bunch of free PDFs, but most are academic and hard for me to read (I'm not a native English speaker and don't have a special math degree). (I've been looking around a lot for this; good starter material is hard to find.)

    Read the article

  • Want to tap into a niche market. Do I create a new site or bolt on to the existing site?

    - by nitbuntu
    Hi, after a lot of hard work and a few years of perseverance, I'm seeing regular sales on my website, and they have been growing steadily over the past year. However, the entrepreneur in me wants to tap into a niche market I've become very interested in. It's possible to bolt this niche onto my existing site as an additional category without it looking too out of place, and my new category of products would benefit from the ranking my current site gets. The kind of people who would purchase these new niche products, however, are very particular and obsessive about detail. For example, many vegetarians would not eat at KFC even if it introduced a new range of veggie burgers. So I thought it best to create a new website, and since my existing site was built on an 'old-school' shopping cart and there are many more up-to-date, feature-rich ones available now, I wanted to use a different shopping cart system. My dilemma is that I already have two websites (one B2C and another B2B), and maintaining a second B2C site would vastly increase my workload; I fear I would not be able to pay adequate attention to all the sites. Moreover, the additional customer service work (e.g. answering emails from many separate email accounts) could end up being too confusing and difficult to maintain. The easy answer would be to take on an employee, but I'm just not earning enough to justify this yet. If anyone has any tips or experience they'd like to share that could help me answer this question, I'd be highly grateful.

    Read the article

  • Google sitemap hreflang tag without the main site URL

    - by Rashmi Pandit
    We have websites with multilingual content, e.g.:

        http://www.example.com/about-us/
        http://www.example.com/en-HK/about-us/
        http://www.example.com/en-GB/about-us/
        http://www.example.com/zn-CH/about-us/

    We need to configure the hreflang tags in the sitemap so Google knows there are alternate links for the same pages in different languages. I know that for the above example my sitemap url tag would look like this:

        <url>
          <loc>http://www.example.com/about-us</loc>
          <xhtml:link rel="alternate" hreflang="en-GB" href="http://www.example.com/en-GB/about-us"/>
          <xhtml:link rel="alternate" hreflang="en-HK" href="http://www.example.com/en-HK/about-us"/>
          <xhtml:link rel="alternate" hreflang="zn-CH" href="http://www.example.com/zn-CH/about-us"/>
          <changefreq>daily</changefreq>
          <priority>0.8</priority>
        </url>

    However, if I don't have the main url but just the last three ones with en-HK, en-GB and zn-CH, how should my url tag look? Should I just skip the loc tag and keep the three xhtml:link tags? Or can I specify any url in the loc tag and put the remaining two in xhtml:link tags? I am new to Google sitemaps. Any help is greatly appreciated. Thanks, Rashmi. Edit: From the answer posted at http://stackoverflow.com/questions/18423624/sitemap-for-domain-with-multilanguage-site/18423803#18423803, for my example with sites in en-HK, en-GB and zn-CH, should there be three url tags, with each of them assigned one loc and the other two in xhtml:link?
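
    For what it's worth, the sitemaps protocol does not allow skipping loc: every url entry needs one. Google's hreflang guidance is that each language version gets its own url entry whose link set lists every version, itself included. A sketch for this example (the third entry, for zn-CH, follows the same pattern):

        <url>
          <loc>http://www.example.com/en-GB/about-us/</loc>
          <xhtml:link rel="alternate" hreflang="en-GB" href="http://www.example.com/en-GB/about-us/"/>
          <xhtml:link rel="alternate" hreflang="en-HK" href="http://www.example.com/en-HK/about-us/"/>
          <xhtml:link rel="alternate" hreflang="zn-CH" href="http://www.example.com/zn-CH/about-us/"/>
        </url>
        <url>
          <loc>http://www.example.com/en-HK/about-us/</loc>
          <xhtml:link rel="alternate" hreflang="en-GB" href="http://www.example.com/en-GB/about-us/"/>
          <xhtml:link rel="alternate" hreflang="en-HK" href="http://www.example.com/en-HK/about-us/"/>
          <xhtml:link rel="alternate" hreflang="zn-CH" href="http://www.example.com/zn-CH/about-us/"/>
        </url>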

    Read the article

  • Trigger IP ban based on a request for a given file?

    - by Mike Atlas
    I run a website where "x.php" was known to have vulnerabilities. The vulnerability has been fixed, and I don't have "x.php" on my site anymore. As happens with major public vulnerabilities, script kiddies are running tools that hit my site looking for "x.php" across the entire structure of the site, constantly, 24/7. This is wasted bandwidth, traffic and load that I don't really need. Is there a way to trigger a time-based (or permanent) ban on an IP address that tries to access "x.php" anywhere on my site? Perhaps I need a custom 404 PHP page that captures the fact that the request was for "x.php" and then triggers the ban? How can I do that? Thanks! EDIT: I should add that, as part of hardening my site, I've started using ZBBlock: "This php security script is designed to detect certain behaviors detrimental to websites, or known bad addresses attempting to access your site. It then will send the bad robot (usually) or hacker an authentic 403 FORBIDDEN page with a description of what the problem was. If the attacker persists, then they will be served up a permanently recurring 503 OVERLOAD message with a 24 hour timeout." But ZBBlock doesn't do quite exactly what I want, though it does help with other spam/script/hack blocking.
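
    The custom-404 idea can work; a log-watching tool such as fail2ban is the more robust route, but here is a pure-PHP sketch of the 404-triggered ban (the file locations are hypothetical, and the ban list should live outside the webroot):

        // in the custom 404 handler: record IPs that ask for x.php
        $path = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);
        if (basename($path) === 'x.php') {
            file_put_contents('/var/ban/banned_ips.txt',
                $_SERVER['REMOTE_ADDR'] . "\n", FILE_APPEND | LOCK_EX);
        }

        // included at the top of every page (e.g. via auto_prepend_file):
        $banned = file('/var/ban/banned_ips.txt',
            FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES) ?: array();
        if (in_array($_SERVER['REMOTE_ADDR'], $banned, true)) {
            header('HTTP/1.1 403 Forbidden');
            exit;
        }

    For a time-based rather than permanent ban, store a timestamp next to each IP and ignore entries older than the timeout.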

    Read the article

  • Setting up Cluster Configuration using an existing web server as a Primary Node?

    - by RapidWebs
    Thanks in advance for any help! I am having a slight issue and need help with the decision-making process for setting up my cluster configuration, consisting of a line of Ubuntu (12.04) servers. We currently have a primary node, which resides in a US datacenter; we are going to use it for all serious bandwidth- and resource-intensive websites, and through a configuration of Virtualmin + Webmin it will be set up as a sort of pseudo-cluster, using Virtualmin's cluster modules. Anyway, on to the issue: we also have a business line set up locally, with three servers. Here are their specs:

        - Intel P4 2.4 GHz, 1 GB RAM, 110 GB SATA, Ubuntu 12.04
        - AMD 1.3 GHz, 512 MB RAM, 20 GB IDE
        - P3 Xeon 800 MHz (dual physical processors), 1 GB RAM, 3 x 25 GB RAID configuration (one disk in use for the host operating system)

    The first machine is currently IN USE and is serving virtual hosts off a subdomain. My question is this: how can I integrate the secondary node (which will be the primary node, per se, in this smaller configuration, and which is currently in use) into the cluster configuration with the other two servers for sharing resources, redundancy (HA?), and NFS with the two RAID disks, without having to FORMAT the secondary node, start fresh moving all my services into a DRBD network drive or something similar, and then restore all active Virtualmin virtual hosts? The idea is that I want minimal downtime for people currently being served from server2.mywebsite.com. From what I understand, all services need to be on an NFS share so that they can be mounted on demand and accessed by the machine taking over (i.e. a Heartbeat + DRBD configuration), but my issue is that I already have all these services installed in their default directory structure. How can I most easily set up this NFS and HA system, move all my desired services to this new drive, and do it with minimal downtime, without breaking Virtualmin and everything else on my server? Even just some pointers, a thread I could read, or a step-by-step checklist or rundown of commands I could issue to get started would be great! Thanks!
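
    A minimal sketch of just the NFS piece, to get started (addresses and paths are placeholders; this is not a full Heartbeat/DRBD recipe):

        # on the node exporting the RAID disks: /etc/exports
        /srv/vhosts  192.168.1.0/24(rw,sync,no_subtree_check)
        # then reload the export table:
        #   sudo exportfs -ra

        # on the node taking over, mount on demand:
        #   sudo mount -t nfs 192.168.1.20:/srv/vhosts /srv/vhosts

    Services can then be repointed at the mounted path one at a time, keeping downtime per service to a single stop/copy/start cycle.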

    Read the article

  • Advantage of Software Development [on hold]

    - by user93319
    The worth of the brand that a corporation carries is far higher than its physical presence. If you are venturing into an online business, you need to take special care of the corporate image of your business. Nowadays it is very important for every organization to have its own website, and to enhance a company's online presence it is important to have a good website design as well as a blueprint of the design. A quality site can enhance organizational growth and lend a hand in achieving the company's goals promptly. Since websites have gained so much meaning, it is advisable for an organization to seek professional support for the construction of its own exclusive portals. Expert help may again be essential when one needs a complete renovation on the Web. Any group needs to do a bit of introspection before it decides to look for the services of a professional web software development company. It is good to be completely clear-headed regarding one's requirements. To start with, a business should be familiar with who its potential clientele are. Once this main factor has been given consideration, an organization can go ahead and get its website designed accordingly. By approaching a corporation that offers software development services in India to its customers, a client can make sure that their site is ready with all the most up-to-date features. Professional Assistance Matters: A professional service provider is known to furnish a site with easy-to-use features that prompt visitors to come back again and again. Yet one more benefit of receiving aid from professionals is that they can advise you on the type of content that you should place on display on your site. For example, a business that wants to draw the interest of experts belonging to the corporate world must make sure that the language used on its website is crisp and official.

    Read the article

  • Controlling access to site folders if you cannot use Roles

    - by DavidMadden
    I find myself on an assignment where I could not use System.Web.Security.Roles. That meant that I could not use Visual Studio's Website | ASP.NET Configuration, so I had to go about things another way. The clues were in these two websites:

        http://www.csharpaspnetarticles.com/2009/02/formsauthentication-ticket-roles-aspnet.html
        http://msdn.microsoft.com/en-us/library/b6x6shw7(v=VS.71).aspx

    You can set folder restrictions in your site's main web.config without having to set them in each folder through its own web.config file. In the main default.aspx of the protected subfolder off my main site, I used the following code, MultiFormAuthentication (MFA) having provided the security to this point:

        string role = string.Empty;
        if (((Login)Session["Login"]).UserLevelID > 3)
        {
            role = "PowerUser";
        }
        else
        {
            role = "Newbie";
        }
        // issue a forms-authentication ticket carrying the role in UserData
        FormsAuthenticationTicket ticket = new FormsAuthenticationTicket(
            1,
            ((Login)Session["Login"]).UserID,
            DateTime.Now,
            DateTime.Now.AddMinutes(20),
            false,
            role,
            FormsAuthentication.FormsCookiePath);
        string hashCookies = FormsAuthentication.Encrypt(ticket);
        HttpCookie cookie = new HttpCookie(FormsAuthentication.FormsCookieName, hashCookies);
        Response.Cookies.Add(cookie);

    This gave me the ability to change restrictions on folders without having to restart the website or do any hard coding.

    Read the article

  • Use jQuery and ASP.NET to Build a News Ticker

    Many websites display a news ticker of one sort or another. A news ticker is a user interface element that displays a subset of a list of items, cycling through them one at a time after a set interval. For example, on Cisco's website there is a news ticker that shows the company's latest news items. Each news item is a one-sentence link, such as "Desktop Virtualization Gathers Steam" or "Cisco Reports First Quarter Earnings." Clicking a headline whisks you to a page that shows the full story. Cisco's news ticker shows one headline at a time; every few seconds the currently displayed headline fades out and the next one appears. In total, Cisco has five different headlines - the ticker displays each of the five and then starts back from the beginning. This article is the first in a series that explores how to create your own news ticker widget using jQuery and ASP.NET. jQuery is a free, popular, open-source JavaScript library that simplifies many common client-side tasks, like event handling, DOM manipulation, and Ajax. This article kicks off the series and shows how to build a fairly simple news ticker whose contents can be specified statically in HTML markup or created dynamically from server-side code. Future installments will explore adding bells and whistles, such as: stopping the news ticker rotation when the mouse is hovered over it; adding controls to start, stop and pause the headlines; loading new headlines dynamically using Ajax; and packaging the JavaScript used by the ticker into a jQuery plugin. Read on to learn more! Read More >

    Read the article

  • One site being in a subdirectory of another. Does Google count this against you?

    - by Mick
    I have created two similar websites (relating to monetary systems). So far, one appears to be loved by Google and the other hated, and I'm struggling to work out why. This is a mystery to me because both sites were created by me with the same design philosophy, both in pure HTML, and both are packed to the rafters with references to, and information about, their respective subjects. One issue I'm worried may be the cause is the location of the sites. I got a web hosting package from hostmonster.com for the successful one, but the less-liked one is just an "add-on" which sits in a subdirectory of the successful one. I wonder if Google somehow detects this and treats it as a less significant website? EDIT: Just to clarify, even though one site is an add-on that sits in a subdirectory of the other, the URL is arranged to look like a root; i.e. the unpopular site can be accessed directly with a simple www.myunpopularsite.com address, without specifying any subdirectory. EDIT: Just in case it's important... say the popular site is called pop.com and the unpopular one unpop.com. In the webspace I've purchased, there is a directory called public_html. This is where I put the index.htm and all the other files of my popular site. When I purchased the add-on unpop.com, I made a subdirectory of public_html called unpop. It is within this "public_html\unpop\" that I placed the index.htm and all the other files of my unpopular site. Typing www.unpop.com into the address bar of a browser links directly to the contents of "public_html\unpop\", and the user is not aware that this site is sitting in a subdirectory of another site. BUT if you type "www.pop.com/unpop" into the address bar of a browser you DO see the unpopular site.

    Read the article

  • As a developer, how do I learn sales? [closed]

    - by Dan Abramov
    I quit the company I was working for to pursue a startup opportunity, and I believe in our product. I'm sure it's going to be great if we can attract some customers first to keep going. (I don't want funding.) Our product is targeted at private schools and courses, and helps organize the mess other LMSs introduce. The problem is, our team is basically just me, and I have very little idea about sales and marketing. I can do reasonably good copywriting, but I'm sure I could do better, and being nervous or too techy in a real-world conversation with a client doesn't help. I want to get better, in fact a lot better, at negotiating with clients and pitching my product. I did look for some "sales articles" on the web, and a lot of what I found is plain bullshit on SEO-engineered websites promoting books or $5000 courses. What I need instead is a developer's perspective on how to sell a product you think is great. What are typical programmer's mistakes and misconceptions about sales, and how do you avoid them? How do you evolve into a reasonably great salesman? I can't believe it's all in the mindset and unlearnable. Your own experience, combined with great articles available on the web, is most welcome. To Future Readers: The question got closed because it is not a good fit for this site. I found some helpful tips in a similar question asked on a sister StackExchange site about startups: I'm a terrible salesperson. What can I do about it?

    Read the article

  • Pausing and Resuming the jQuery / ASP.NET News Ticker

    Many websites display a news ticker of one sort or another. A news ticker is a user interface element that displays a subset of a list of items, cycling through them one at a time after a set interval. In December 2010 I wrote an article titled Use jQuery and ASP.NET to Build a News Ticker that explored how to create your own news ticker widget using jQuery and ASP.NET. The news ticker's content is defined as an unordered list (<ul>) where each list item (<li>) represents a news headline. Once the ticker's content is defined, having it cycle through the headlines is as simple as calling the JavaScript function startTicker(id, numberToShow, duration), which begins cycling the headlines in the unordered list with the specified id, showing numberToShow headlines at a time and cycling to the next headline every duration number of milliseconds. This installment shows how to enhance the news ticker to enable pausing and resuming. With these enhancements, the ticker can be configured to automatically pause rotating its headlines when the user mouses over it, and to resume rotating them once the user mouses out. Similarly, with a bit of additional markup and script you can add pause and play buttons to a ticker, allowing a user to start and stop the ticker by clicking an image or button. Read on to learn more! Read More >

    Read the article

  • What reasons are there to reduce the max-age of a logo to just 8 days? [closed]

    - by callum
    Most websites set max-age=31536000 (1 year) in the Cache-Control headers of static assets such as logo images. Examples: YouTube, Yahoo, Twitter, the BBC. But there is a notable exception: Google's logo has max-age=691200 (8 days). I've checked the headers on the Google logo in the past, and it definitely used to be 1 year. (Also, it used to be part of a sprite, and now it is a standalone logo image, but that's probably another question...) What could be valid technical reasons why they would want to reduce its cache lifetime to just 8 days? Google's homepage is one of the most carefully optimised pages in the world, so I imagine there's a good reason. Edit: Please make sure you understand these points before answering: Nobody uses short max-age lifetimes to allow modifying a static asset in future. When you modify it, you just serve it at a different URL. So no, it's nothing to do with Google doodles. Think about it: even if Google didn't understand this basic trick of HTTP, 8 days still wouldn't be appropriate, as only those users who don't have the original logo cached would see the doodle on doodle-day – and then that group of users would go on seeing the doodle for the following 8 days after Google changed it back :) Web servers do not worry about "filling up" the caches of clients (or proxies). The client manages this by itself – when it hits its own storage limit, it just starts dropping the lowest-priority items to make space for new ones. The priority score is based on the question "How likely am I to benefit from having cached this URL?", which has nothing to do with the max-age value the server sent when the URL was originally requested; it's a heuristic based on the "frecency" of requests for that URL. The max-age simply lets the server set a cut-off point – the time at which the client is supposed to discard the item regardless of how often it's being re-used. It would be very nice and trusting of a downstream client/proxy to rely on all origin servers "holding back" from filling up their caches, but I don't think we live in that world ;)
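
    For background, the "different URL" trick from the first point usually pairs a far-future lifetime with versioned file names. An Apache mod_expires sketch (the file names are illustrative):

        <IfModule mod_expires.c>
          ExpiresActive On
          ExpiresByType image/png "access plus 1 year"
        </IfModule>
        # when the asset changes, the HTML simply references a new URL,
        # e.g. /img/logo-v2.png replacing /img/logo-v1.png

    Against that baseline, an 8-day lifetime on an asset as stable as a logo really is the anomaly the question describes.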

    Read the article

  • Why was my site rejected for Google AdSense?

    - by hyuun jjang
    I have a 3-year-old blog with around 16 articles/tutorials about programming problems and solutions. It's been getting a lot of views lately, so I decided to apply for a Google AdSense account. When I first applied via Blogger, Google replied with the following statement: "Page Type: In order to participate in Google AdSense, publishers' websites and application information must satisfy the following guidelines: - Your website must be your own top-level domain (www.example.com and not www.example.com/mysite). - You must provide accurate personal information with your application that matches the information on your domain registration. - Your website must contain substantial, original content..." So, as I understood it, I decided to buy a domain and point my Blogger blog at that new naked domain, and here is the newly bought domain where all the content of my old blog resides: http://icodeya.com/. I reapplied, hoping that this time I would make the cut, but then I got this reply: "Further detail: Unable to review your site: While reviewing http://www.icodeya.com/, we found that your site was down or unavailable. We suggest you check whether there was a typo in the URL submitted. When your site is operational, you can resubmit your application with the correct site by following the directions below." I'm a bit disappointed. Maybe I did something wrong with the DNS configuration or something, but you can clearly see that my site is fully functional. I heard that Google sends robots to crawl the site, etc. It's just sad, because I invested in a domain name and now I can't even find ways to earn from it. Any tips?

    Read the article

  • Consolidating multiple domain names

    - by Mike
    I have a client that has three separately hosted copies of their website, each on a separate domain name. The websites are all essentially the same, bar a few discrepancies caused by badly managed updates in the past. I will soon be launching a completely new website for them, at which point all three domain names are to resolve to the same web server. One domain name will become the default domain name that they refer to in all their literature, and the other two will simply be used as catch-alls for old links, bookmarks, and so on. I would like to know what people consider the best route to achieve this. My plan so far is:

        1. Get the new site up and running on the new webserver.
        2. Change the relevant A record of the default domain name to point to the new webserver.
        3. a) Keep the existing hosting accounts in operation. Create a list of 301 redirects from old page names on the old sites to new page names on the new site; or
           b) Configure CNAME records for the non-default domain names, each pointing to the new webserver. Create a list of 301 redirects on the new site that redirect from old page names to new page names.

    If my understanding is correct, 3a will help to maintain whatever search engine rankings the sites already have (I know it's not going to be perfect), while at the same time informing search engines that the old domain names are no longer in use. What's a good approach to take here?
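
    For either variant of step 3, the per-page redirects are one line each with Apache's mod_alias in the old sites' .htaccess (the domain and page names are placeholders):

        Redirect 301 /old-about.html http://www.default-domain.com/about/
        Redirect 301 /old-contact.html http://www.default-domain.com/contact/
        # optional catch-all for anything unmapped:
        RedirectMatch 301 ^/(.*)$ http://www.default-domain.com/

    The 301s tell search engines the move is permanent, which is what preserves whatever ranking the old pages have.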

    Read the article

  • Does the Google crawler really guess URL patterns and index pages that were never linked to?

    - by Dominik
    I'm experiencing problems with indexed pages which were (probably) never linked to. Here's the setup:

        1. Data server: an application with a RESTful interface which provides the data.
        2. Website A: provides the data of (1) at http://website-a.example.com/?id=RESOURCE_ID
        3. Website B: provides the data of (1) at http://website-b.example.com/?id=OTHER_RESOURCE_ID

    So all of the non-private data is stored on (1), and the websites (2) and (3) can fetch and display this data, each a representation of the data with additional cross-linking between them. In fact, the URL /?id=1 of website-a points to the same resource as /?id=1 of website-b; however, the resource id:1 is useless at website-b. Unfortunately, the Google index for website-b now contains several links to resources belonging to website-a, and vice versa. I have "heard" that the Google crawler tries to determine the URL pattern (which makes sense for deciding which pages should go into the index and which not), and furthermore guesses other URLs by trying different values (like "I know that id 1 exists, let's try 2, 3, 4, ..."). Is there any evidence that the Google crawler really behaves that way (which I doubt)? My guess is that the Google crawler submitted an HTML form and somehow got links to those unwanted resources. I found some similar questions posted about this, including "Google webmaster central: indexing and posting false pages" [link removed]; however, none of those pages gives any evidence.

    Read the article

  • How to stay productive? What time management software is available?

    - by andrewsomething
    So, since I started using askubuntu.com I've spent entirely too much time here answering other people's questions. Now maybe someone could help me with that by answering this one. I'm looking for time management software for Ubuntu. There are a number of these programs floating around for Windows; RescueTime is one that is very popular. The key features that I'd like to see in a Linux app, which RescueTime has, are:

        - Automatically records what application you are using, including what websites you visit.
        - Reports and graphs on your time usage.
        - Notifications for when you have spent too much time on "distractions."

    While RescueTime doesn't officially support Linux, there is an open-source RescueTime Linux Uploader. Unfortunately, it seems to only support Firefox and Epiphany for website tracking, and I'm a Chromium user. The other major drawback of RescueTime is that it is a web service. I'd much rather not upload detailed information about how I spend my time to some third party; Google already knows too much about me as it is. Project Hamster, a GNOME time management app, comes so close. Sadly, it does not automatically track what you are doing. If I had enough discipline to manually report to an applet what I was up to, I doubt I'd need this. (How cool would it be if they provided some Zeitgeist integration to handle that part?)

    Read the article

  • Who is a CMS really for?

    - by Eirc man
    I have lately started discovering content management systems, and I was wondering: who is a CMS really for? What I mean by that: is it only for companies, small businesses, or individuals that pay a contractor to make a website whose users can just upload content through an easy interface? Or is it also used by programmers to build their own websites and projects? Would a Facebook, Twitter, or StackExchange ever have started by using a CMS, even a very powerful one? Would you as a programmer build your own "fancy" website on top of a CMS, for example Typo3, or would you build it from scratch? P.S. To be clearer, a summary: would I as a developer, having chosen a CMS to develop a website, be stuck if the site needs to scale to a big base of users? What if I build a website using a CMS and the website explodes in popularity, and then I want to add much more functionality than I had planned? Is it possible that the CMS will limit the growth, because it might not have been built for that kind of scale?

    Read the article

  • PHP safe_mode is a pain, looking for advice (Ubuntu 12.04 server, public webserver)

    - by user73279
    Maybe Ask Ubuntu isn't the right forum, or I haven't provided the right search query, but I haven't seen anything in my searching of Ask Ubuntu on PHP safe_mode; I get lots of Windows Safe Mode and Ubuntu Safe Mode results, but not PHP safe_mode. I keep running into one issue after another regarding PHP safe_mode. (I write a lot of my own PHP code for various site maintenance tools and such.) I know safe_mode is going away in the next version of PHP, but I still see a fair amount of advice recommending that you leave it enabled. I've recently consolidated from 3 servers down to 1, and at least one of those old servers had safe_mode disabled without any issues (though the lack of issues may simply have been a matter of good luck). None of the previous 3 gave me this much trouble, so I'm guessing some additional php.ini/PHP safe_mode setting was turned on for the new server. I primarily run WordPress for my websites, with a few MediaWiki sites sprinkled in, and I am currently running into an issue with WordPress's auto-update feature, as it doesn't seem to be able to use fopen. WordPress is not relaying the actual error message to me, but since I was just able to update the plugins I'm using, this is a safe_mode problem. I've had a lot of safe_mode issues since consolidating to this new server. Long story short, the advice I'd seen to use safe_mode was all at least 2 years old. Do I really need it? If I disable PHP safe_mode, is there a good set of security measures I should implement - i.e. chmod 640 /var/www/..., add this to your .htaccess, etc. - to protect my server/sites? Thanks
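
    As one hedged starting point, the php.ini settings below are a common post-safe_mode hardening baseline (the paths are illustrative and would be set per site):

        safe_mode = Off
        open_basedir = /var/www/example.com:/tmp
        disable_functions = exec,passthru,shell_exec,system,proc_open,popen
        expose_php = Off

    open_basedir confines each site's file access and disable_functions removes the shell-execution family, which together cover most of what safe_mode was being used for.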

    Read the article

  • How to decide on a price for the project as a freelancer

    - by Shekhar_Pro
    I have seen similar questions on this SE site, but none comes close to a sure-shot answer, and many are rather subjective. So, to be more objective, I am taking a website as an example for you to decide the development price I should quote for the complete work; I would like specific figures. In the past, I developed many projects for my classmates (computer science, and a few in .NET) when I was in college, and there I just arbitrarily quoted the price I would take depending on my mood and the customer's ability to pay, usually ranging from Rs. 500 (about $10 USD) to Rs. 1500 (about $30 USD). I have also developed a few websites, but those were open source and free. This time, however, impressed by my work, I have got a client who wants a website developed similar to this: [ http://www.jeetle.in/ ]. Taking this website as an example, tell me how much I should charge for the complete work, from design to payment gateway implementation (excluding the charge the payment gateway provider will take). Some information you might like to consider: I am the only developer on this project, if that makes any difference, and I would be using ASP.NET and MSSQL Express for server-side processing and jQuery on the client. The time period offered for development is about 4 to 6 weeks. It's like I know my work, but not how much I'm worth.

    Read the article

  • Programming habits, patterns, and standards that have developed out of appeal to tradition/by mistake? [closed]

    - by user828584
    Being self-taught, the vast majority of what I know about programming has come from reading other people's code on websites like this. I'm starting to wonder if I've developed bad or otherwise pointless habits from other people, or even just made invalid assumptions. For example, in JavaScript, void 0 is used in a lot of places, and until I saw this, I just assumed it was necessary and that 0 had some significance. Also, the HTTP header referer is misspelled but hasn't been changed, because changing it would break a lot of applications. This is also mentioned in Code Complete 2: The architecture should describe the motivations for all major decisions. Be wary of "we've always done it that way" justifications. One story goes that Beth wanted to cook a pot roast according to an award-winning pot roast recipe handed down in her husband's family. Her husband, Abdul, said that his mother had taught him to sprinkle it with salt and pepper, cut both ends off, put it in the pan, cover it, and cook it. Beth asked, "Why do you cut both ends off?" Abdul said, "I don't know. I've always done it that way. Let me ask my mother." He called her, and she said, "I don't know. I've always done it that way. Let me ask your grandmother." She called his grandmother, who said, "I don't know why you do it that way. I did it that way because it was too big to fit in my pan." What are some other examples of this?

    Read the article

  • Country-specific content vs global content

    - by Ando
    I have a global product presentation website, myproduct.com. For certain countries I also own the country domain: myproduct.co.uk, myproduct.com.au, myproduct.es, myproduct.de, etc. The presentation website is translated into multiple languages, and I have set up redirects: myproduct.es redirects to myproduct.com/es/, myproduct.de redirects to myproduct.com/de/, etc. The content so far is the same, just translated into different languages. The advantage is that it's easy to keep the content aligned: everything is managed from one centralized dashboard (I'm using WordPress with qTranslate). Now I'm running into trouble, as for different countries I want localized content. For the UK I want to run different promotions and use a different reseller than for .com.au, so I would like users coming from myproduct.co.uk to see something different from those coming from myproduct.com.au (and not be redirected to myproduct.com as they are right now). How can I achieve this? I could duplicate the whole main website and modify only certain parts, but then I would have a lot of duplicate content (e.g. info about how the product works), and pages that are likely to change (the FAQ page) that I would have to keep updated across all the websites. I could instead duplicate the main website only partially: the localized website would have only the pages that are different, and all other links would point to the .com site. This would solve the duplication problem but would confuse the user, who would navigate from .co.uk to .com without noticing and then wonder how to get back. Is there another, better option?

    Read the article

  • Self-serve advertising service

    - by Mystere Man
    I am seeking a self-serve advertising service for my websites, but I have a few restrictions that seem to make what I'm looking for hard to find. Specifically, I want to place "advertise here" links on my pages and allow end users to purchase advertising on that site, page, and location. These ads will not be part of a national network. Requirements:

        - Supports multi-tenancy. That is, I have a number of domains using the same "web application" but with customized content per domain. When a customer wants to advertise on a given domain, the ads will only appear on that domain and on that page of the domain (even though the page name may be the same across multiple domains).
        - Supports fixed ad prices, not just CPC. I need monthly and quarterly pricing regardless of performance.
        - Integrates with OpenX and other ad networks, so that if there is no self-serve ad in a given zone, it will use national or direct advertising.

    Shiny Ads has much of this, but I'm looking for alternatives, as their prices are a bit crazy (20%) and they can only do PayPal.

    Read the article

  • Is there any place to find real-world usage-style tutorials for programming languages?

    - by OleDid
    Let's face it: when you want to learn something completely new, be it mathematics or a foreign language, it's easiest to learn when you get real-world scenarios in front of you, with the theory applied. For example, trigonometry can be extremely interesting when applied to the creation of 2D platform games, and Norwegian can be really interesting to learn if you live in Norway. When I try to look at a new programming language, I always find these steps the hardest: 1. What tools do I need to compile, and how do I do it? 2. The introduction step: why is this programming language so cool? Where and how is it used? (This is the step I am looking for: real-world scenarios.) The rest - deep diving into the language, pure theory and such - is often much easier if you have completed steps 1 and 2, because now you know what it's all about and can just read the specification when you need to. What I'm asking is: do you have any recommendations for places I can find such material for programming languages? Be it websites or companies selling books in this style, I'm interested. Also, I am interested in all languages. (If I found "real-world usage" explained for even INTERCAL, I would be interested.) In some other thread here, I found a book called "Seven Languages in Seven Weeks". This is kind of what I am looking for, but I believe there must be "more like this".

    Read the article
