Search Results

Search found 20353 results on 815 pages for 'website review'.

Page 284/815

  • IE8 complains about SSL name mismatch

    - by Cerin
    When visiting an SSL-protected website, IE8 complains that the certificate name does not match the website address, but gives no information about the certificate or what name it's looking for. Visiting the same site in IE9 (or IE9 in "IE8 mode"), Firefox, Chrome, and Safari shows no problems and confirms that the certificate matches the address. Certificate checkers indicate everything is installed and configured correctly. Does anyone know what might be causing this? Is this a known issue or bug in IE8? I've been Googling for similar issues, but due to the uncertainty as to what's actually going on, I'm not sure what to search for. My problem reads similar to this question. However, my server is running Apache2.
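
    One possibility worth ruling out (an assumption, not something confirmed in the question) is SNI: IE8 on Windows XP does not send the TLS Server Name Indication extension, so a name-based SSL virtual host setup in Apache will hand it the default vhost's certificate. A quick check from a shell, with the hostname as a placeholder:

        # Emulate a non-SNI client (like IE8 on XP): no -servername flag
        openssl s_client -connect www.example.com:443 </dev/null 2>/dev/null | openssl x509 -noout -subject

        # Emulate an SNI-capable client (like IE9, Firefox, Chrome)
        openssl s_client -connect www.example.com:443 -servername www.example.com </dev/null 2>/dev/null | openssl x509 -noout -subject

    If the two commands print different certificate subjects, the mismatch is SNI-related rather than a certificate installation problem.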

    Read the article

  • Radeon 9200 Driver?

    - by usuhh86
    I've been using Linux for a while, but haven't really done more than install programs. I have Ubuntu 12.04 with an ATI Radeon 9200 graphics card. Ubuntu didn't recognize it during the install. All I want is a driver. Preferably the proprietary driver, but I'll settle for an open-source one if I need to. I went to the AMD support website, and whenever I click on the download link for "ATI Proprietary Linux x86 Display Driver 8.28.8" I get redirected to the main AMD website. I tried this on Firefox, Chrome, and even booted up my netbook and tried it on IE9. Does anybody have a download link? And if not, a link to download an open-source alternative? Any help will be greatly appreciated. I'm pretty much still a newbie at this.

        OpenGL vendor string: Tungsten Graphics, Inc.
        OpenGL renderer string: Mesa DRI R200 (RV280 5961) x86/MMX/SSE2 TCL DRI2
        OpenGL version string: 1.3 Mesa 8.0.2
        Not software rendered: yes
        Not blacklisted: yes
        GLX fbconfig: yes
        GLX texture from pixmap: yes
        GL npot or rect textures: yes
        GL vertex program: yes
        GL fragment program: no
        GL vertex buffer object: yes
        GL framebuffer object: yes
        GL version is 1.4+: no
        Unity 3D supported: no
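
    A hedged observation: the "Mesa DRI R200" renderer string above suggests the open-source radeon driver is already in use, and the fglrx 8.28.8 build on AMD's site long predates Ubuntu 12.04's X server. Assuming the stock 12.04 package names, this sketch confirms the open-source stack is installed:

        lspci -nn | grep -i vga                # confirm the card (RV280 / Radeon 9200)
        sudo apt-get install xserver-xorg-video-ati libgl1-mesa-dri mesa-utils
        glxinfo | grep -i renderer             # should report the Mesa DRI R200 driver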

    Read the article

  • Web Hosting Checklist

    - by Chris
    Hello, I am a web developer who is starting to look into hosting his own website. I would like to showcase my programming skills (PHP, MySQL, C#, WordPress). I'm OK with the languages themselves, but the actual hosting side is where my knowledge starts to get a little shaky. I know the basics (bandwidth, sub-domains, rewrite rules), but I would love your input to help me formulate a checklist of features to look out for in a web-hosting service. Also, I was wondering if there are any reliable hosting providers who give you the option to host both C# code-behinds and PHP code, as I would like to have two versions of my site, one in C# and one in PHP. The hope is that if I need to look for another job, this website will help me show possible employers my server-side knowledge. I hope this is enough info. I did some researching online but found a bunch of useless articles, and I've always had luck on the Stack Exchange sites, so hopefully you can help me. Thanks a lot.

    Read the article

  • Google I/O 2010 - OpenSocial in the Enterprise

    Google I/O 2010 - Best practices for implementing OpenSocial in the Enterprise. Social Web, Enterprise 201. Speakers: Mark Weitzel, Matt Tucker, Mark Halvorson, Helen Chen, Chris Schalk. Enterprise deployments of OpenSocial technologies bring an additional set of considerations that may not be apparent in a traditional social network implementation. In this session, several enterprise vendors demonstrate how they've been working together to address these issues in a collection of "Best Practices". The session also provides a review of existing challenges for enterprise implementations of OpenSocial. For all I/O 2010 sessions, please go to code.google.com. From: GoogleDevelopers. Time: 38:23.

    Read the article

  • Ops Center Update 2 is available!

    - by Owen Allen
    Oracle Enterprise Manager Ops Center Release 1, Update 2 (12.1.2.0.0) is available. This release includes support for Oracle Solaris 11.1 and Oracle Linux 6.3, the ability to attach a network multiple times to an LDom guest using the same vswitch, support for HMP 2.2.3, improved options for adding users from remote directory servers, and a few other useful features. A list of new features is in What's New. You can view the documentation online, or you can download a zip file of the library from the Overview tab. If you're running Ops Center in connected mode, the updates are delivered to the UI. If you are in disconnected mode or have not yet installed Ops Center, you can go to the Ops Center download page to get the latest release. Before installing or upgrading to Ops Center 12c Update 2, review the Release Notes for things to watch out for, then see the upgrade instructions in the Administration Guide.

    Read the article

  • Easiest driver backup solution?

    - by mgpyone
    I've faced missing drivers after formatting Windows. Graphics, touchpad, and fingerprint drivers in particular are sensitive: the hardware does not work until the drivers are installed. I can usually download them from the device's official website, but some drivers are not found on the website. Thus, I'm looking for a solution (maybe a piece of software) that can back up my drivers effectively (if it can do so on a schedule, that would be great) and restore them easily from a local copy, especially for Windows XP and Vista. Any suggestions will be really appreciated.

    Read the article

  • Why is Ubuntu offline (except torrents) while Windows is online?

    - by Fahim al Islam
    I am using a static wired connection. Everything was perfect, but suddenly, a few hours ago, I lost access to every website. Dropbox and Ubuntu One also can't connect, and ping requests fail, yet I can still download through torrents. I am not running torrent downloads and browsing at the same time, so I don't think this is an issue of torrents using all the bandwidth. One important point is that this connection works perfectly under Windows on this same PC (my PC is dual-boot). I have tried what izx suggested (running "sudo sh -c 'echo nameserver 8.8.8.8 >> /etc/resolv.conf'"), but I'm facing the same problem again. Now I can't even ping 8.8.8.8 or google.com, though I can ping 74.125.228.2 (which is a Google IP address). I can't understand what's happening or why. I'm new to this website and many of its rules are unknown to me, so please bear with my mistakes. Looking forward to help from anyone. Thanks to all.
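
    A short diagnostic sketch that separates a DNS failure from a routing failure (the IPs are the ones from the question):

        ping -c 3 74.125.228.2    # raw IP works -> the IP layer / default route is fine
        ping -c 3 google.com      # hostname fails -> name resolution is the problem
        cat /etc/resolv.conf      # check which nameservers are actually configured

    If raw IPs respond while hostnames fail, the fix lies in the connection's DNS settings rather than in the link itself.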

    Read the article

  • How should I manage a team with different skill levels?

    - by Jon Purdy
    I'll be working on a software project with some friends of mine, and I've been appointed technical lead. None of these guys is a bad programmer at all, but I do have significantly more experience than them. I need to distribute the work among everyone on the team while making sure that we don't tread on one another's toes and that they meet the relatively high standards of quality and scalability we need to make this project successful, without requiring me to review everything they commit. How should I maintain standards while avoiding micromanagement? Is it enough to make some diagrams, schedule some code reviews, and trust that I'll be able to fix anything that they might break, or should I go the TDD route and write explicit tests for the team to satisfy?

    Read the article

  • Want to tap into a niche market. Do I create a new site or bolt on to my existing site?

    - by nitbuntu
    Hi, after a lot of hard work and a few years of perseverance, I'm seeing regular sales on my website, and they have been growing steadily over the past year. However, the entrepreneur in me wants to tap into a niche market which I've become very interested in. It's possible to bolt this niche onto my existing site as an additional category without it looking too out of place, and my new category of products would also benefit from the ranking my current site gets. The kind of people who would purchase these new niche products, however, are very particular and obsessive about detail. So, for example, many vegetarians would not eat at KFC even if it were to introduce a new range of veggie burgers. So I thought it best to create a new website, and since my existing site was created using an 'old-school' shopping cart and there are many more up-to-date, feature-rich ones available now, I wanted to use a different shopping cart system. My dilemma is that I already have two websites (one B2C and another B2B site), and maintaining a second B2C site would vastly increase my workload; I fear that I would not be able to pay adequate attention to all the sites. Moreover, the additional customer service work (e.g. answering emails from many separate email accounts) could end up being too confusing and difficult to maintain. The easy answer would be to take on an employee, but I'm just not earning enough to justify this yet. If anyone has any tips or experience they'd like to share which could help me answer this question, I'd be highly grateful.

    Read the article

  • Planning to buy a server with at least 48 GB RAM; are blades the way to go?

    - by varchar1
    We're planning to host our website ourselves for the first time. We currently have an 8 GB Linode, and memory usage sits around 90% most of the time, so I want to move the website to our own server with much more RAM. This will be my first time managing any physical server hardware. I came across IBM's BladeCenter and found it interesting. Can I just buy a blade and run it, or do I have to buy the chassis as well? Also, do I need to buy a UPS? How hard is it to set up? What about the hard drives; can I set them up easily? Please advise.

    Read the article

  • Nginx IP Whitelist

    - by Will
    Is it possible to create an IP whitelist for my nginx proxy server without hard-coding allow or deny rules in the config file? Could nginx consult a separate database to check whether a user is allowed to access the website? Ideally nginx would link to an external database, or at minimum to a list of allowed IPs kept on the same server, so I can easily update the list without restarting nginx every time. In the future I would like to tie this into my website: a user would log in, their IP would be linked to their account, and they would be able to update it themselves if it changed, to regain access from their new address. So it would be easier for me if the list of IPs lived in some kind of external database. Any help is appreciated.
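
    A minimal sketch of the reload-friendly variant (paths and addresses are placeholders): keep the allow rules in a generated include file, regenerate that file from the database, and reload nginx without downtime. This still uses allow/deny under the hood, but nothing has to be edited by hand:

        # /etc/nginx/whitelist.conf -- regenerated from the database by a script
        allow 203.0.113.7;
        allow 198.51.100.0/24;

        # inside the proxy's server block in /etc/nginx/nginx.conf
        location / {
            include /etc/nginx/whitelist.conf;
            deny all;
            proxy_pass http://127.0.0.1:8080;
        }

    After rewriting the include file, "nginx -s reload" re-reads the configuration without dropping active connections. For a truly dynamic per-request database check, nginx's auth_request module pointed at a small internal web service is the usual route.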

    Read the article

  • CLR via C# - first post of many!

    - by TATWORTH
    I am currently reading CLR via C#, ISBN 978-0-7356-2704-8. Whilst quite correctly described by the publisher as a "Deep Dive", this is a book that C# developers with 6-18 months plus experience ought to read, and certainly any serious Microsoft programming shop ought to have a copy. For our VB.NET brethren, a book of this quality is a good excuse to learn C#. (And before you ask, my favourite language of C# and VB.NET is the one that gets me the next contract!) When I started programming 31 years ago I went through IBM 360 Orientation, which gave me a comprehension of what worked best at the machine-code level. This is the first book I have found that explains the workings of the .NET Framework well enough to show why particular choices are good. This is my first blog post here. In the coming weeks, I intend to:

        Carry on with my review of CLR via C# and bring out practical points from that work.
        Post details of useful utilities.
        Post some "Tales from the coal face.."

    Read the article

  • Both IPv4 and IPv6 at the same time over a DSL connection?

    - by namiheike
    Let me describe my situation: when I connect the computer by wire, I get an IPv6 address automatically, there's a "Wired connection" tab in Network Manager, and I can access websites that support IPv6 (Google, Facebook, Twitter...) with a hosts file, or via a proxy like google.com.sixxs.org. But if I want to access the whole internet, I have to create a DSL connection with the username and password my ISP gave me. After I switch to this DSL connection, I can no longer access websites over IPv6, even with the site's IPv6 address in /etc/hosts. I then realized that I lose my IPv6 connectivity entirely, because ping6 says "connect: Network is unreachable". The problem is that there's no IPv6 tab or any IPv6 options in the configuration of the DSL connection. It feels like I can only use one connection at a time, but the DSL connection doesn't support IPv6 and the wired connection doesn't support IPv4 (I mean, there's no way to input the password the ISP gave me). This may make somebody uncomfortable, but when I work in MS Windows there's no such problem: I can (or at least it feels like I can) access v4 and v6 at the same time. So how do I solve this? Thanks a lot. I'm on 11.10 + GNOME 3.
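
    A hedged sketch, assuming the DSL connection is a PPPoE session driven by pppd underneath Network Manager and that the ISP actually negotiates IPv6CP (neither is guaranteed): pppd can be told to enable IPv6 on the PPP link via its system-wide options file.

        # enable IPv6CP negotiation for pppd sessions
        echo "+ipv6" | sudo tee -a /etc/ppp/options
        # reconnect the DSL connection, then test
        ping6 -c 3 ipv6.google.com

    If the ISP only provides IPv4 over PPPoE, the usual workaround of that era is a tunnel broker (e.g. SixXS or Hurricane Electric) running over the DSL connection.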

    Read the article

  • Controlling access to site folders if you cannot use Roles

    - by DavidMadden
    I found myself on an assignment where I could not use System.Web.Security.Roles. That meant that I could not use Visual Studio's Website | ASP.NET Configuration. I had to go about things another way. The clues were in these two websites:

        http://www.csharpaspnetarticles.com/2009/02/formsauthentication-ticket-roles-aspnet.html
        http://msdn.microsoft.com/en-us/library/b6x6shw7(v=VS.71).aspx

    You can set the restrictions on folders in your web.config without having to set them in multiple folders through their own web.config files. In the main default.aspx file of the protected subfolder off my main site, I did the following (MultiFormAuthentication (MFA) provides the security up to this point):

        string role = string.Empty;
        if (((Login)Session["Login"]).UserLevelID > 3)
        {
            role = "PowerUser";
        }
        else
        {
            role = "Newbie";
        }

        FormsAuthenticationTicket ticket = new FormsAuthenticationTicket(
            1,
            ((Login)Session["Login"]).UserID,
            DateTime.Now,
            DateTime.Now.AddMinutes(20),
            false,
            role,
            FormsAuthentication.FormsCookiePath);

        string hashCookies = FormsAuthentication.Encrypt(ticket);
        HttpCookie cookie = new HttpCookie(FormsAuthentication.FormsCookieName, hashCookies);
        Response.Cookies.Add(cookie);

    This all gave me the ability to change restrictions on folders without having to restart the website or do any hard coding.
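
    A sketch of the web.config side this post alludes to (the folder and role names are hypothetical): a <location> element in the site root's web.config can lock down a subfolder by role.

        <location path="protected">
          <system.web>
            <authorization>
              <allow roles="PowerUser" />
              <deny users="*" />
            </authorization>
          </system.web>
        </location>

    One caveat worth noting: a role stored in the ticket's userData field is not visible to <allow roles> on its own; the ticket is normally decrypted in Application_AuthenticateRequest (Global.asax) and turned into a GenericPrincipal carrying those roles.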

    Read the article

  • What impact would struggling on a project have for a young developer in a consultancy?

    - by blade3
    I am a youngish developer (working for 3 years). I took a job 3 months ago as an IT consultant (my first time as a consultant). In my first project, all went well until the later stages, where I ran into problems with Windows/WMI (lack of documentation, etc.). As important as it is not to leave surprises for the client, this did happen. I was supposed to go back to finish the project about a month and a half ago, after a date had been scheduled, but this did not happen either. The project code was slightly rushed too and went through QA (no idea what the results are). My probation review is in a few weeks' time, and I was wondering what sort of impact this would have. My manager hasn't mentioned this project to me, and apart from it, everything's been OK. He even said at the beginning, "if you are tight on time, just ask for more", so he has been accommodating. (At that time I was doing well; the problems came later.)

    Read the article

  • SEO title tag and earning a high rank on search engines [closed]

    - by Josh White
    Possible Duplicate: What are the best ways to increase your site's position in Google?

    One of the most basic SEO techniques is including an accurate description of under 64 characters in the title tag of each page. I was wondering whether it is considered ethical SEO to set up the contents based on a search keyword. So if the user searches for 'apples pictures', for example, then the title of the webpage would be 'apple pictures'. Note that the search keywords would accurately describe my website contents, because the title will always relate to the body of the webpage, and 85-90% of the terms searched for will return corresponding results. Is this considered good SEO practice, and is it ethical? Also, can someone explain the idea behind "linking"? I read somewhere that it is good SEO practice to link to other websites and for other websites to link to you. Does this mean that I should include as many links to other websites as possible (ones that are somehow relevant to my website's goal)? And if I joined forums or services and posted my website URL in the signature, would that count as other websites linking to me?

    Read the article

  • HTTP traffic through PIX VPN from outside site

    - by fwrawx
    I have a remote site with a website that only allows access from the outside IP assigned to our local PIX. I have users connecting to the local network over a VPN who need to be able to view this remote site. I don't think this works because the packets want to come in and go out over the same (ext) interface. So I'm looking for a way to make this work using the PIX, or to set up a service on a server on the local network to act as a middleman for the HTTP requests. The remote site doesn't support setting up a VPN to our PIX, and the remote website is dishing out pages over a non-standard port. Can I use squid or something similar to proxy just one site?
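
    One possible middleman, as a hedged sketch (hostname and port are placeholders): a reverse proxy on an internal server, e.g. Apache with mod_proxy. VPN users browse the internal server, and the internal server's own requests leave through the PIX's outside IP, which the remote site already allows.

        # in the internal server's Apache config; requires mod_proxy and mod_proxy_http
        ProxyRequests Off
        ProxyPass        /remote/ http://remote-site.example.com:8081/
        ProxyPassReverse /remote/ http://remote-site.example.com:8081/

    squid in accelerator mode would do the same job; the key point is only that the proxy sits inside the network, so the remote site sees the PIX's outside address.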

    Read the article

  • Basic Google Analytics Click Tracking and/or Overview

    - by Alan Storm
    This is a really basic Google Analytics question. Apologies in advance if it's not appropriate here, but I've had a lot of luck on Stack Overflow and this seems like the best Stack Exchange site for a question like this. I'm trying to understand how Google Analytics goals work, or whether they're the right feature for my situation. Most of the documentation I find online refers to the old version of the UI, not the new one. I have a website, let's call it blog.example.com. This website drives traffic to an ecommerce store, let's call that store.example2.com. I want reports on which links on blog.example.com are being clicked through to store.example2.com. How do you do this in Google Analytics? Are goals the right area to be looking at? Do I set up the goals on store.example2.com or blog.example.com? Or both? Is there any canonical user guide (free or paid) that covers how this works? I'm a competent programmer, but it's years since I dealt with conversion tracking on any serious level, and we've progressed well beyond my frozen-caveman pixel-tracking knowledge. Thanks in advance.
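
    A hedged sketch using the async ga.js syntax of that era (the domain names are the question's placeholders): outbound clicks can be recorded as events on blog.example.com itself, which needs no access to the store's pages.

        <a href="http://store.example2.com/some-product"
           onclick="_gaq.push(['_trackEvent', 'Outbound', 'Click', this.href]);">
          Product
        </a>

    The clicks then show up in the blog profile's Events reports. One known caveat: since the browser navigates away immediately, production setups often add a tiny delay before following the link so the event beacon has time to fire. Goals, by contrast, live in the profile of the site where the conversion completes, so tying a goal on store.example2.com back to the blog would additionally need cross-domain tracking (_setDomainName/_link).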

    Read the article

  • Get Ready For C# 4.0!

    Visual Studio 2010 is here! And of course this means that C# 4.0 is also here. Let's do a quick review of the new language features added in this release. Dynamic: the dynamic keyword is a key feature of this release. It closes the gap between dynamic and statically-typed languages. Now you can create dynamic objects and let their types be determined at run time. With the addition of the System.Dynamic namespace, you can create expandable objects and advanced class wrappers, and you can provide interoperability...
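
    A minimal sketch of what dynamic and System.Dynamic enable (the names are invented for illustration):

        using System;
        using System.Dynamic;

        class Program
        {
            static void Main()
            {
                // ExpandoObject: members can be attached at run time
                dynamic person = new ExpandoObject();
                person.Name = "Ada";                               // no compile-time member check
                person.Greet = (Func<string>)(() => "Hello, " + person.Name);
                Console.WriteLine(person.Greet());                 // resolved by the DLR at run time
            }
        }

    Call sites that would once have needed reflection or verbose interop plumbing now read like ordinary member access.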

    Read the article

  • Allow from referer for an HTTP-basic-protected SSL Apache site

    - by user64204
    I have an Apache site protected by HTTP basic authentication. The authentication is working fine. Now I would like to bypass authentication for users coming from a particular website, by relying on the HTTP Referer header. Here is the configuration:

        SetEnvIf Referer "^http://.*.example\.org" coming_from_example_org

        <Directory /var/www/>
            Options Indexes FollowSymLinks MultiViews
            AllowOverride None

            Deny from all
            Allow from env=coming_from_example_org

            AuthName "login required"
            AuthUserFile /opt/http_basic_usernames_and_passwords
            AuthType Basic
            Require valid-user

            Satisfy Any
        </Directory>

    This is working fine for HTTP, but failing for HTTPS. My understanding is that in order to inspect the HTTP headers, the SSL handshake must be completed, but Apache wants to inspect the <Directory> directives before doing the SSL handshake, even if I place them at the bottom of the configuration file. Q: How could I work around this issue? PS: I'm not obsessed with the HTTP Referer header; I could use other options that would allow users from a known website to bypass authentication.
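
    A small test sketch to take browsers out of the loop while debugging (URL and referer are placeholders); if this succeeds over HTTP but asks for credentials over HTTPS, the SetEnvIf/Allow pair is the right place to keep looking:

        curl -k -I https://www.example.com/ -H "Referer: http://www.example.org/page"
        curl    -I http://www.example.com/  -H "Referer: http://www.example.org/page"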

    Read the article

  • How can I handle a .org domain on my own nameserver without paying for unwanted services?

    - by etuardu
    I have a dot-org domain that I use to run a website. Until now, I had an account with a hosting-plus-domain provider. Recently I decided to run the website on my own webserver and to handle the domain on my own nameserver. What do I need to do in order to handle my .org domain on my own? Do I still need a registrar? Or does pir.org provide a more direct way to simply bind a nameserver to a domain name?
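
    For what it's worth, a registrar does remain in the picture: PIR, the .org registry, only takes registrations through accredited registrars, but any registrar will let the domain's NS records point at a self-hosted nameserver. Once delegated, a quick sanity check (domain and host are placeholders):

        dig +short NS example.org              # should list your own nameserver(s)
        dig @ns1.example.org example.org A     # ask your nameserver directly for the site's record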

    Read the article

  • How does Google Web Starter Kit serve adaptive images for mobile?

    - by 5argon
    My website weirdly (in a good way) serves smaller images when viewed on mobile, and I wanted to know what causes this. As far as I know this is not the default behaviour, so I think it must be Google Web Starter Kit's doing. Here is the debug information from on-device debugging: all images became 231 B in size, no matter how large they actually are. (When debugging on the desktop, the size varies.) I recently tried Google Web Starter Kit (https://github.com/google/web-starter-kit). Its tools are built on Ruby, Node.js, SASS and Gulp to help you 'build' a website. Before a build you get automatic reload, because the Gulp script watches all the files for you. The build runs various tools to minify HTML and CSS and to compress images. According to this page, https://developers.google.com/web/fundamentals/tools/build/build_site, gulp-imagemin is used. So is imagemin doing the mobile optimization for me? What kind of compression can serve automatically resized images on mobile? And why is the size 231 B? Is it related to my screen size?
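
    For reference, a sketch along the lines of the era's Web Starter Kit gulpfile (task name and paths are illustrative, not copied from the kit):

        // gulpfile.js -- requires: npm install gulp gulp-imagemin
        var gulp = require('gulp');
        var imagemin = require('gulp-imagemin');

        gulp.task('images', function () {
          return gulp.src('app/images/**/*')
            .pipe(imagemin({ progressive: true, interlaced: true }))
            .pipe(gulp.dest('dist/images'));
        });

    Worth noting: imagemin recompresses images but does not resize them per device, so a uniform 231 B on mobile is more plausibly the transfer size of cached "304 Not Modified" responses in the device-debugging session than a genuinely adaptive image.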

    Read the article

  • Please recommend tools for PC, browser, and home network performance problems

    - by mobibob
    My client has been experiencing some odd response behavior in their browser for the past few days. A classic "nothing has changed" situation, so I am starting at ground zero. Browsing a website will time out or take a ridiculous time to load; other times, the same site and query are immediately responsive. Once a connection is established, video streams are uninterrupted. The home network hosts a website, but Apache's 'access.log' shows no activity for it. I am using speedtest.net to check whether the ISP link is OK, and it looks typical (average, plus or minus). I have to suspect the home network is beaconing or doing something very abnormal, but I don't know where to start.
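
    A few starting points, as a hedged sketch assuming a Linux or Mac machine is available on the network (intermittent page loads combined with smooth streaming often point at DNS rather than bandwidth):

        ping -c 20 8.8.8.8                    # packet loss / latency jitter to a stable host
        dig google.com | grep "Query time"    # slow lookups mimic "slow websites"
        traceroute google.com                 # where along the path the delay appears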

    Read the article

  • Why was my site rejected for Google Adsense?

    - by hyuun jjang
    I have a 3-year-old blog with around 16 articles/tutorials about programming problems and solutions. It's been getting a lot of views lately, so I decided to apply for a Google AdSense account. When I first applied via Blogger, Google replied with the following statement:

        Page Type: In order to participate in Google AdSense, publishers' websites and application information must satisfy the following guidelines: - Your website must be your own top-level domain (www.example.com and not www.example.com/mysite). - You must provide accurate personal information with your application that matches the information on your domain registration. - Your website must contain substantial, original content...

    So, as I understood it, I decided to buy a domain and point my Blogger blog to that new naked domain. Here is the newly bought domain, where all the contents of my old blog now reside: http://icodeya.com/ I reapplied, hoping that this time I would make the cut. But then I got this reply:

        Further detail: Unable to review your site: While reviewing http://www.icodeya.com/, we found that your site was down or unavailable. We suggest you check whether there was a typo in the URL submitted. When your site is operational, you can resubmit your application with the correct site by following the directions below.

    I'm a bit disappointed. Maybe I did something wrong with the DNS configuration or something, but you can clearly see that my site is fully functional. I've heard that Google sends robots to crawl the site, etc. It's just sad, because I invested in a domain name and now I can't even find ways to earn from it. Any tips?
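
    A hedged first check for the "down or unavailable" verdict: make sure both the naked domain and the www host resolve and answer, since AdSense reviewed http://www.icodeya.com/ specifically.

        dig +short icodeya.com www.icodeya.com   # both names should resolve
        curl -I http://www.icodeya.com/          # expect 200, or a redirect that ends in 200

    A missing or broken www CNAME after a Blogger custom-domain move is a common way for a site that "works" at one name to look down at the other.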

    Read the article

  • Distribute terabyte files to the public from a web server

    - by MarkJ
    Hi, we need to set up a website which makes two or three large files publicly available; the files will be 1 or 2 terabytes each. Although they will be public, in practice I expect only a relatively small number of scientists will want to download them. What is the best way to allow this? I've had a quick talk with a web-hosting provider (Rackspace) and they suggested a hybrid solution: an entry-level managed server (we predict fairly low traffic for the website, but we do need to install some custom CGI software), plus some cloud storage which hooks into Limelight Networks; that would host the large files, for download by FTP. It sounded OK to me, but I know relatively little about server administration. Does it make sense? Thanks in advance, Mark
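
    One property worth verifying with whatever host is chosen, as a hedged aside: at terabyte sizes, downloads will inevitably fail mid-transfer, so the file service must support resumption (HTTP range requests, or FTP's REST command). A quick probe against a candidate URL (placeholder shown):

        curl -I -H "Range: bytes=0-1023" http://files.example.com/dataset-1.dat
        # a "206 Partial Content" reply means clients can resume interrupted downloads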

    Read the article
