Search Results

Search found 39224 results on 1569 pages for 'content delivery network'.

Page 593/1569

  • Can I set up a COM Port that Connects via TCP/IP?

    - by Kristopher Johnson
    I have an RS232 device connected to a Digi PortServer on my network. I can telnet to the Digi's IP and port number to communicate with that RS232 device. I have a Windows application that knows how to talk to the device, but it wants to connect directly to an RS232 port. So, my question: Is there some way to set up a COM port so that when this application connects to it, it goes through the network to the Digi to communicate with the device? I have run across the utility TCP-Com, which seems like it might be a solution. Is this the best option?

    Read the article

  • How to solve 404 for static files with Django and Nginx?

    - by Lucio
    I set up a Trusty VM with Django + Nginx (other stuff too). The server is working; I get the "Welcome to Django" page. But when I go to servername/admin it loads the HTML page but fails to load the static content. The admin page has these links to static content:

        <link rel="stylesheet" type="text/css" href="/static/admin/css/base.css" />
        <link rel="stylesheet" type="text/css" href="/static/admin/css/login.css" />

    Both of the CSS files give me 404, as the Nginx log shows:

        192.168.56.1 - - [05/Jun/2014:12:04:09 -0300] "GET /admin HTTP/1.1" 301 5 "-" "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:29.0) Gecko/20100101 Firefox/29.0"
        192.168.56.1 - - [05/Jun/2014:12:04:09 -0300] "GET /admin/ HTTP/1.1" 200 833 "-" "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:29.0) Gecko/20100101 Firefox/29.0"
        192.168.56.1 - - [05/Jun/2014:12:04:10 -0300] "GET /static/admin/css/base.css HTTP/1.1" 404 142 "http://ubuntu-server/admin/" "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:29.0) Gecko/20100101 Firefox/29.0"
        192.168.56.1 - - [05/Jun/2014:12:04:10 -0300] "GET /static/admin/css/login.css HTTP/1.1" 404 142 "http://ubuntu-server/admin/" "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:29.0) Gecko/20100101 Firefox/29.0"

    I think the error is in my nginx.conf file, but I do not know how to solve it.
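
    A minimal sketch of the kind of server block that usually fixes this, assuming Django's STATIC_ROOT points at /srv/myproject/static (a hypothetical path) and that "python manage.py collectstatic" has been run so the admin CSS actually exists on disk:

        # hypothetical /etc/nginx/sites-available/myproject
        server {
            listen 80;
            server_name ubuntu-server;

            # serve /static/ straight from disk instead of passing it to Django
            location /static/ {
                alias /srv/myproject/static/;   # must match STATIC_ROOT and end with a slash
            }

            location / {
                proxy_pass http://127.0.0.1:8000;   # or your gunicorn/uwsgi upstream
                proxy_set_header Host $host;
            }
        }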

    Read the article

  • Profit at Oracle OpenWorld 2012

    - by user462779
    It's only a week away: Oracle OpenWorld descends on San Francisco from September 30 to October 4. It's always a frantic week for the Profit editorial staff, but here are a few things we've got going in San Francisco that you'll want to watch out for:
    Profit on Oracle OpenWorld Live: The Oracle video team will be broadcasting live from the event all week. I have a few interesting on-air interviews booked, including a conversation with business/technology researcher Andrew McAfee (Monday, Oct 1 @ 11:45am), Acorn Paper CEO David Weissberg (Tuesday, Oct 2 @ 12:15pm) and Abhay Parasnis, Oracle Senior Vice President, Oracle Public Cloud (Wednesday, Oct 3 @ 10:45am).
    Profit in the Oracle Partner Network Lounge: This summer, I worked with the amazing Oracle Partner Network (OPN) team to create the Profit Oracle Specialized Partner Edition 2012. It's a great catalog of Oracle partner success stories and insight into the OPN strategy from its leadership. Look for the special issue of Profit in the Oracle PartnerNetwork Lounge: the place where partners can meet formally or informally with colleagues, customers, prospects, and other industry professionals. Moscone South, Exhibit Hall, Room 100.
    Oracle Customer Experience Summit @ OpenWorld: There's been a lot of discussion within my editorial team (and content published, as well) about Customer Experience. To keep pace with this evolving subject, I'll be attending this special embedded conference on Wednesday and Thursday (Oct. 3-4). I'm especially looking forward to Seth Godin's presentation: he was one of the first experts we interviewed for Profit Online five years ago.
    The Executive Edge @ OpenWorld: Of course, my Oracle OpenWorld is mostly filled with meetings/interviews with Oracle customers about completed Oracle projects and the strategic impact of enterprise IT on business. The ideal place for these conversations is The Executive Edge @ OpenWorld embedded conference.
    Samovar Tea Lounge at Moscone Center: I spend my down time on the roof of Moscone North, preparing for meetings or having impromptu conversations with attendees at this little oasis overlooking Yerba Buena Gardens. Feel free to drop by for a chat!
    See you in San Francisco! -Aaron Lazenby

    Read the article

  • Wrong source IP when accessing internet directly from TMG server

    - by jarod1701
    Hi everyone, after implementing a ForeFront TMG server I'm facing only one problem: after I added a second IP to the external adapter, I had to manually set "NAT Address Selection" inside the network rule "Internet Access" to the first IP, since all others would get blocked by the Cisco firewall. This configuration works as long as traffic comes from the internal network (e.g. browsers on clients). Traffic from the TMG itself directed to the internet always carries the second IP as its source address and gets blocked. All our other TMGs/ISAs are running fine and I never came across this problem. Does anybody have a clue, coz I don't?! Kevin

    Read the article

  • What is the proper SEO handling of pages appearing in popups using IFRAMEs?

    - by Alexis Wilke
    I am working on a CMS which makes use of IFRAMEs to display some forms, for illustration, say a Search form. So the user clicks the Search button and the website reacts by opening a popup window which includes an IFRAME to the actual Search form. This means I have a "bare"¹ page with the search form, which, obviously, is directly accessible via its own URI. In terms of SEO, the forms have no content worthy of being indexed, so I was thinking of marking them as NOINDEX. Is that the correct way to handle such pages? From what I read on another question, Google suggests putting links from IFRAMEs to other pages. However, I definitely do not want a user-visible link to the Home page, or whatever page the form belongs to, in the content of my forms, because that could be misinterpreted by the user. However, if <link> tags would work too, which one should I use? (i.e. "top" would work, right? with the home page in there?)
    ¹ By bare I mean that the normal theme is not shown; it will be a plain white background with just the bare form.
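
    For reference, the standard robots markup for keeping such a bare form page out of the index is either a meta tag inside the page's <head> or the equivalent HTTP header; this is generic robots markup, not something specific to this CMS:

        <meta name="robots" content="noindex, nofollow">
        X-Robots-Tag: noindex          (same effect, sent as a response header by the web server)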

    Read the article

  • Forwarding a subdomain to the main domain using GoDaddy

    - by Ryan Hayes
    I have a current blog, which was hosted on Tumblr at http://blog.ryanhayes.net. I'm moving it over to http://ryanhayes.net, and have all the 301 redirects set up for the blog entries to map to my new blog, which is hosted using GoDaddy (domain included). When I try to set up a subdomain forward, I'm greeted with a nice 403 Forbidden response (as of this writing, you can see it at http://blog.ryanhayes.net). When I try to ping both the subdomain and domain, they point to the same IP address, so I know the blog subdomain has at least switched over to point to the same content. I don't really understand why I would get a 403 Forbidden on the same content that I can see perfectly fine via another domain. Currently, I have a CNAME of blog pointing to @, which is how "www" is set up to forward, so I'm assuming it would do the same thing. My question is: what is the proper way to set up my DNS to make the blog subdomain forward to my main domain (301) using the GoDaddy DNS manager? Bonus: What is the background on why I am getting a 403 error the current way?

        Forbidden
        You don't have permission to access / on this server.
        Additionally, a 403 Forbidden error was encountered while trying to use an ErrorDocument to handle the request.

    UPDATE 12/7/2010: The error on the site has been fixed, so you can no longer view it on my site.
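
    If the host behind ryanhayes.net is Apache (an assumption; GoDaddy shared hosting typically is), a sketch of an .htaccess rule that 301-redirects anything arriving on the blog subdomain to the main domain, instead of relying on the forwarding UI:

        # .htaccess in the document root, assumes mod_rewrite is available
        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^blog\.ryanhayes\.net$ [NC]
        RewriteRule ^(.*)$ http://ryanhayes.net/$1 [R=301,L]

    As for the bonus question: a 403 like this usually just means the vhost answering for blog.ryanhayes.net points at a directory with no index file and directory listings disabled, which fits the "CNAME to @ without a matching redirect rule" setup described above.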

    Read the article

  • Domain forwarding to a IE "trusted site" opens a blank page

    - by Michael Jasper
    My employer, a University, regularly hosts conferences and other events. While websites for these events are hosted on our domain, they frequently request customized .com urls. We then forward these domains to the specific site. Recently, we discovered a problem where a page will not load if the following conditions are met (using a current example):

        A website is created on our CMS for a conference: http://continue.weber.edu/nulc
        A domain is created, http://www.nulc2012.com, and forwarded to http://continue.weber.edu/nulc
        The user enters http://www.nulc2012.com into their address bar using IE7 or IE8
        The user has *.weber.edu listed as a "trusted site" in IE security settings (the case for nearly all on-campus computers)

    When this happens, their browser will correctly transfer to the page http://continue.weber.edu/nulc/index.php, however the page is blank, returning only:

        <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.0 Transitional//EN">
        <HTML><HEAD>
        <META content="text/html; charset=windows-1252" http-equiv=Content-Type></HEAD>
        <BODY></BODY></HTML>

    Is there any known solution to this problem? Or am I missing something completely? Note: tested websites do load correctly in Chrome, Firefox, and Safari.

    Read the article

  • Modify database for the SharePoint 2010 Enterprise Search administration web site

    - by Mark Hall
    Does anyone know how to modify the database settings for the Enterprise Search administration web site? When you configure the service application via Central Administration, SharePoint just decides to use the default database server and gives it a name like Enterprise_Search_DB_Identifier. I want to modify this to at least give it a name that makes sense, like SharePoint_Search_AdministrationWebContent, and it might be nice to move it to the database server that is hosting the crawl and property databases. I figured out how to move the Central Administration web content database, but this database is not listed as a content database. It is listed as a Microsoft.Office.Server.Search.Administration.SearchAdminDatabase. I have not tested whether the same process would work, but because you are doing a RemoveContentDatabase and NewContentDatabase, I would assume not. Any help would be appreciated.

    Read the article

  • Increase backup speed, Backup Exec 2010 - QNAP TS419U+ NAS

    - by user99912
    We have a QNAP NAS and the network shares are being backed up by Backup Exec 2010 over SMB. We can't install the remote agent on the NAS as it has an ARM processor and, as far as I am aware, there is no compatible agent. Do you have any suggestions on a faster method of backing up these shares as opposed to the current scenario? Currently the network bandwidth is not the issue; it seems that this access method is just not able to go any quicker. We've also added the NAS shares to the start of the selection list, but we're still running into 18 hours total backup time (the total amount of data on the NAS is roughly 650GB). Any comments and/or suggestions welcome. EDIT: Data is being pulled from the NAS by Backup Exec to an LTO4 tape drive.

    Read the article

  • Set up linux box for hosting a-z

    - by microchasm
    I am in the process of reinstalling the OS on a machine that will be used to host a couple of apps for our business. The apps will be local only; access from external clients will be via vpn only. The prior setup used a hosting control panel (Plesk) for most of the admin, and I was looking at using another similar piece of software for the reinstall - but I figured I should finally learn how it all works. I can do most of the things the software would do for me, but am unclear on the symbiosis of it all. This is all an attempt to further distance myself from the land of Configuration Programmer/Programmer, if at all possible. I can't find a full walkthrough anywhere for what I'm looking for, so I thought I'd put up this question, and if people can help me on the way I will edit this with the answers, and document my progress/pitfalls. Hopefully someday this will help someone down the line.

    The details:
        CentOS 5.5 x86_64
        httpd: Apache/2.2.3
        mysql: 5.0.77 (to be upgraded)
        php: 5.1 (to be upgraded)

    The requirements:
        SECURITY!!
        Secure file transfer
        Secure client access (SSL Certs and CA)
        Secure data storage
        Virtualhosts/multiple subdomains
        Local email would be nice, but not critical

    The Steps:

    Download latest CentOS DVD-iso (torrent worked great for me).

    Install CentOS: While going through the install, I checked the Server Components option thinking I was going to be using another Plesk-like admin. In hindsight, considering I've decided to try to go my own way, this probably wasn't the best idea.

    Basic config: Setup users, networking/ip address etc. Yum update/upgrade.

    Upgrade PHP/MySQL: To upgrade PHP and MySQL to the latest versions, I had to look to another repo outside CentOS. IUS looks great and I'm happy I found it!

    Add IUS repository to our package manager
        cd /tmp
        wget http://dl.iuscommunity.org/pub/ius/stable/Redhat/5/x86_64/epel-release-1-1.ius.el5.noarch.rpm
        rpm -Uvh epel-release-1-1.ius.el5.noarch.rpm
        wget http://dl.iuscommunity.org/pub/ius/stable/Redhat/5/x86_64/ius-release-1-4.ius.el5.noarch.rpm
        rpm -Uvh ius-release-1-4.ius.el5.noarch.rpm
        yum list | grep -w \.ius\.   # list all the packages in the IUS repository; use this to find PHP/MySQL version and libraries you want to install

    Remove old version of PHP and install newer version from IUS
        rpm -qa | grep php           # to list all of the installed php packages we want to remove
        yum shell                    # open an interactive yum shell
        remove php-common php-mysql php-cli          # remove installed PHP components
        install php53 php53-mysql php53-cli php53-common   # add packages you want
        transaction solve            # important!! checks for dependencies
        transaction run              # important!! does the actual installation of packages.
        [control+d]                  # exit yum shell
        php -v
        PHP 5.3.2 (cli) (built: Apr 6 2010 18:13:45)

    Upgrade MySQL from IUS repository
        /etc/init.d/mysqld stop
        rpm -qa | grep mysql         # to see installed mysql packages
        yum shell
        remove mysql mysql-server    # remove installed MySQL components
        install mysql51 mysql51-server mysql51-devel
        transaction solve            # important!! checks for dependencies
        transaction run              # important!! does the actual installation of packages.
        [control+d]                  # exit yum shell
        service mysqld start
        mysql -v
        Server version: 5.1.42-ius Distributed by The IUS Community Project

    Upgrade instructions courtesy of IUS wiki: http://wiki.iuscommunity.org/Doc/ClientUsageGuide

    Install rssh (restricted shell) to provide scp and sftp access, without allowing ssh login
        cd /tmp
        wget http://dag.wieers.com/rpm/packages/rssh/rssh-2.3.2-1.2.el5.rf.x86_64.rpm
        rpm -ivh rssh-2.3.2-1.2.el5.rf.x86_64.rpm
        useradd -m -d /home/dev -s /usr/bin/rssh dev
        passwd dev

    Edit /etc/rssh.conf to grant access to SFTP to rssh users.
        vi /etc/rssh.conf
    Uncomment or add:
        allowscp
        allowsftp
    This allows me to connect to the machine via SFTP protocol in Transmit (my FTP program of choice; I'm sure it's similar with other FTP apps). rssh instructions appropriated (with appreciation!) from http://www.cyberciti.biz/tips/linux-unix-restrict-shell-access-with-rssh.html

    Set up virtual interfaces
        ifconfig eth1:1 192.168.1.3 up   # start up the virtual interface
        cd /etc/sysconfig/network-scripts/
        cp ifcfg-eth1 ifcfg-eth1:1       # copy default script and match name to our virtual interface
        vi ifcfg-eth1:1                  # modify eth1:1 script
        # ifcfg-eth1:1 | modify so it looks like this:
        DEVICE=eth1:1
        IPADDR=192.168.1.3
        NETMASK=255.255.255.0
        NETWORK=192.168.1.0
        ONBOOT=yes
        NAME=eth1:1
    Add more Virtual interfaces as needed by repeating. Because of the ONBOOT=yes line in the ifcfg-eth1:1 file, this interface will be brought up when the system boots, or the network starts/restarts.
        service network restart
        Shutting down interface eth0:       [ OK ]
        Shutting down interface eth1:       [ OK ]
        Shutting down loopback interface:   [ OK ]
        Bringing up loopback interface:     [ OK ]
        Bringing up interface eth0:         [ OK ]
        Bringing up interface eth1:         [ OK ]
        ping 192.168.1.3
        64 bytes from 192.168.1.3: icmp_seq=1 ttl=64 time=0.105 ms

    And this is where I'm at. I will keep editing this as I make progress. Any tips on how to configure virtual interfaces/ip based virtual hosts for SSL, setting up a CA, or anything else would be appreciated.
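
    Since the post ends by asking about IP-based virtual hosts for SSL, here is a minimal sketch for Apache 2.2 of one SSL vhost bound to one of the virtual interfaces; the file name, hostname, paths and certificate files are hypothetical and would come from your own CA:

        # /etc/httpd/conf.d/ssl-app1.conf  (hypothetical file name)
        Listen 192.168.1.3:443
        <VirtualHost 192.168.1.3:443>
            ServerName app1.example.internal
            DocumentRoot /var/www/app1
            SSLEngine on
            SSLCertificateFile    /etc/pki/tls/certs/app1.crt
            SSLCertificateKeyFile /etc/pki/tls/private/app1.key
            SSLCACertificateFile  /etc/pki/tls/certs/internal-ca.crt   # your own CA
            # optionally require client certificates signed by the internal CA
            SSLVerifyClient require
            SSLVerifyDepth  1
        </VirtualHost>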

    Read the article

  • Windows 7 Computer Maintenance - Restore removed Desktop Shortcuts

    - by HaydnWVN
    I'm aware of the inbuilt maintenance 'tool' of Microsoft's which is making every network administrator's job a nightmare (if left turned on). In short, it'll happily delete every desktop shortcut to a network resource. We have it disabled by default. My question is related to recovering these shortcuts: Windows doesn't seem to dump them into the Recycle Bin, and although you can 'restore previous versions' of the Desktop folder, you are unable to do it with the user logged in... Rather a problem if you only have the 1 user account. We have a user who (although the 'Computer Maintenance' tool is turned off) had it come up in 'Action Centre' and chose to run it. I've been unable to reproduce it, but the tool has certainly run (and removed 24 shortcuts from her desktop) although it is still saying 'Windows is not checking your system for maintenance problems'. ?! Very frustrating! Any ideas if these shortcuts are backed up/moved elsewhere so I can easily find and restore them?

    Read the article

  • Why is my eth0 getting a dynamic ip when it is configured to be static?

    - by sdek
    For some reason our office linux box is being assigned an ip address via dhcp and I don't know why. What is confusing to me is that when I check system-config-network it shows that my eth0 is set up to be a static ip address. And /etc/sysconfig/network-scripts/ifcfg-eth0 also shows it is set up to be a static ip, yet it is getting a different ip address than the one specified in ifcfg-eth0. Let me know if you have any suggestions or ideas on where I can look next. Here are a few details that might help you figure out what an idiot I am :)

        Fedora 11
        Router in front of this box is running dhcp, starting at 10.42.1.100
        This box is configured to be 10.42.1.50 (at least I think it is!), subnet 255.255.255.0 (which is the same as the router's lan subnet)
        Instead of having the static IP, this box is getting assigned 10.42.1.100.

    Here are the ifcfg-eth0 details:

        DEVICE=eth0
        BOOTPROTO=none
        ONBOOT=yes
        TYPE=Ethernet
        USERCTL=no
        NM_CONTROLLED=no
        NETMASK=255.255.255.0
        IPADDR=10.42.1.50
        GATEWAY=10.42.1.1
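
    A couple of quick checks that usually expose the culprit (typically a stray dhclient instance or NetworkManager grabbing the interface); these are stock Fedora commands, nothing specific to this box:

        ps aux | grep -E '[d]hclient|[N]etworkManager'   # is a DHCP client or NetworkManager still running against eth0?
        chkconfig --list NetworkManager                  # is NetworkManager set to start at boot?
        chkconfig --list network                         # the classic network service should own eth0 instead
        grep NM_CONTROLLED /etc/sysconfig/network-scripts/ifcfg-eth0   # should read NM_CONTROLLED=no, as above
        ip addr show eth0                                # confirm which address is actually bound right now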

    Read the article

  • netsh wlan add profile not imported passphrase

    - by sirlancelot
    I exported a wireless network connection profile from a Windows 7 machine correctly connected to a WiFi network with a WPA-TKIP passphrase. The exported xml file shows the correct settings and a keyMaterial node which I can only guess is the encrypted passphrase. When I take the xml to another Windows 7 computer and import it using netsh wlan add profile filename="WiFi.xml", it correctly adds the profile's SSID and encryption type, but a balloon pops up saying that I need to enter the passphrase. Is there a way to import the passphrase along with all other settings?
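
    For what it's worth, the keyMaterial in a default export is encrypted with keys tied to the exporting machine, so other computers cannot decrypt it. netsh can export the key in clear text instead; run this from an elevated prompt on the source machine (the SSID, folder and file names below are placeholders):

        netsh wlan export profile name="YourSSID" folder=C:\temp key=clear
        :: the exported XML now contains <protected>false</protected> and the passphrase in plain text,
        :: so importing it on the second machine should no longer prompt for the key:
        netsh wlan add profile filename="C:\temp\YourSSID.xml"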

    Read the article

  • How to see who is using my wifi

    - by Bakhtiyor
    Dear all, I have a WiFi router at home and would like to know if there is any way to tell who is connected to my WiFi network and using it. My WiFi network is password protected, but there is still a chance that neighbors have cracked it and are using my WiFi for free. For this reason I would like to know if there is any tool to inspect the WiFi router's log in order to see which computers are connecting to it. Thanks a lot beforehand.
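
    Besides the router's own DHCP client list (usually the most reliable place to look), a quick scan from any machine on the LAN will list the devices that are currently up; this sketch assumes a 192.168.1.0/24 subnet and that nmap is installed:

        sudo nmap -sn 192.168.1.0/24   # ping-scan the subnet; prints the MAC and vendor of every responding host (use -sP on older nmap)
        arp -a                         # show the addresses this machine has recently seen on the local network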

    Read the article

  • Suggested Resources Visual Studio Plug-In

    Today's post is a quick plug for a new tool developed by my friend Olaf Conijn, who (amongst other things) has been a developer on several versions of Enterprise Library. His new tool is called Suggested Resources for .NET Developers, and the current 0.8 release works with both Visual Studio 2008 and Visual Studio 2010. So what does it do? Well, here's what Olaf has to say: This Visual Studio Integration Package is a proof of concept in: Aggregation of online content within the Visual Studio IDE. Analysis of development activities within the Visual Studio IDE. This combination of features allows Suggested Resources for .NET developers to pro-actively suggest online content that applies to the task being performed by a developer... A bit like having a programming pair that searches for online resources while you focus on getting the job done. For more info, screenshots and downloads, head to the Codeplex project site or the Visual Studio Gallery page.

    Read the article

  • How to re-do the hard disks in a WD World Book Edition II?

    - by jfmessier
    I recently purchased a WD World Book II, a 2 TB one. I call it the "White Box". It has those 2 1TB drives, and they were in this RAID 1 config, only giving me about 1 TB. I could not delete the raid array, so I took the drives to a Linux box. But I also deleted the entire partitions of the disks, and now I cannot even get the existing RAID array back on this WD White Box. The drives are fine, but I cannot get them to work on the WD White Box. My goal was to get back to a real 2 TB storage space. If I cannot get those drives back in the White Box, I can re-use them elsewhere, but this would mean a waste of the firmware and network connection. After the fact, I read that, anyway, the network performance is rather poor. Thanks :-)

    Read the article

  • Install remote desktop session host remotely

    - by Jorge
    I've removed the remote desktop session host role from one of my Azure VMs, so I can't RDP into it. I have tried to recreate the endpoints and even the VM with no luck. I also can't access it remotely with Server Manager; I get this error:

        "The client cannot connect to the destination specified in the request. Verify that the services on the destination is running"

    I cannot connect remotely to the registry either:

        "Make sure this computer is on the network, has remote administration enabled, and that both computers are running the remote registry service"

    PsPing tells me:

        "The remote computer refuses the network connection"

    What workaround can I follow to solve this?
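
    If any remote execution channel is still available (for example the Azure custom script extension, or PowerShell remoting once WinRM is reachable again), a sketch of the commands that switch plain administrative RDP back on; this assumes Windows Server 2012 or later and an English install (the firewall group name is localized):

        # allow RDP connections again and open the matching firewall rules
        Set-ItemProperty -Path 'HKLM:\SYSTEM\CurrentControlSet\Control\Terminal Server' -Name fDenyTSConnections -Value 0
        Enable-NetFirewallRule -DisplayGroup 'Remote Desktop'
        # make sure the Remote Desktop service itself is running
        Set-Service -Name TermService -StartupType Manual
        Start-Service -Name TermService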

    Read the article

  • Storing large amounts of small files into bigger files on Windows

    - by asmo
    Let's say I have 50 GiB of files that weigh around 500 KiB each. My guess is that having, for example, 5 large files of 10 GiB each with the same content archived in them would be better for hard drive performance. Am I correct? Will there be a noticeable gain on an NTFS filesystem? Finally, which tool could I use to group the files together while retaining the ability to modify the content of the archive with zero or minor performance loss? For example, I like TrueCrypt archiving because after mounting an archive file, it creates a drive which I can use seamlessly as if it was a normal drive. The only thing with TrueCrypt is that I don't need encryption/compression, only archiving.
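
    One container that behaves like the TrueCrypt example but without the encryption is a plain VHD: Windows 7 can create and attach one natively, and once mounted it looks like an ordinary NTFS drive. A sketch using a diskpart script (the file path, size and drive letter are arbitrary examples):

        rem save as makevhd.txt and run from an elevated prompt: diskpart /s makevhd.txt
        create vdisk file="D:\archives\bundle1.vhd" maximum=10240 type=expandable
        select vdisk file="D:\archives\bundle1.vhd"
        attach vdisk
        create partition primary
        format fs=ntfs label="bundle1" quick
        assign letter=V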

    Read the article

  • IE8 Refuses to run Javascript from Local Hard Drive

    - by Josh Stodola
    I have a problem that just started at work recently and the network manager is certain he did not change anything with the group policy. Anyways, here is a detailed description of the problem. My machine is Windows XP SP3, and I use IE8 to browse. We have McAfee anti-virus software that I am unable to configure. I use the following file to test...

        <!DOCTYPE html>
        <html>
        <head>
          <title>Javascript Test</title>
        </head>
        <body>
          <script type="text/javascript">
            document.write("<h1>PASS</h1>");
          </script>
          <noscript>
            <h1>FAIL</h1>
          </noscript>
        </body>
        </html>

    When I open this file from the C: drive, it fails every time. If I execute it anywhere else (local/remote web server or on a mapped network drive), it works just fine. When I am simply browsing the Internet, Javascript on web sites works just fine. It is only failing on files running from my C: drive. Additionally, I have had a couple other programmers in the department try this file on their C: drive, and it works fine for them. So I don't believe it is a group policy thing. I need to fix this because I do extensive testing from my C: drive, and I am accustomed to doing so. I don't want to get into the habit of moving files to a different drive just to test. Things I have tried:

        Enabled "Allow Active Content to Run Files on My Computer" in Options | Advanced | Security
        Enabled "Allow Active Scripting" in Options | Security | Custom Level
        Verified that "Script" was not checked as disabled in Developer Toolbar
        Added localhost to Trusted Sites in Options
        Disabled McAfee completely (momentarily, with help from network admin)
        Used an older DOCTYPE in my test HTML page
        Re-installed IE8 completely
        Ran regsvr32 on the JScript.dll
        Slammed keyboard

    I am sure that there is a setting somewhere that will fix this problem, possibly in the registry. I would not be surprised if it was related to the developer toolbar. At this point I do not know where else to look. Can anyone help me resolve this problem? EDIT: Regardless of the bounty, this issue is still ongoing.
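
    One more thing that may be worth trying, since it is not in the list above: add a Mark of the Web comment as the first line of the test file, which makes IE treat the local file as if it came from the Internet zone instead of the locked-down Local Machine zone (the 0014 is the character count of the URL that follows):

        <!-- saved from url=(0014)about:internet -->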

    Read the article

  • Automatically generated /etc/hosts is wrong

    - by Niels Basjes
    I've created a kickstart script to install CentOS 5.5 (32bit) in a fully automated way. The DNS/DHCP setup correctly gives the system the right hostname in both the forward and reverse lookups.

        dig node4.mydomain.com. +short
        10.10.10.64
        dig -x 10.10.10.64 +short
        node4.mydomain.com.

    The state of the installed system right after the installation completed is as follows:

        cat /etc/sysconfig/network
        NETWORKING=yes
        NETWORKING_IPV6=yes
        GATEWAY=10.10.10.1
        HOSTNAME=node4.mydomain.com

        echo ${HOSTNAME}
        node4.mydomain.com

        cat /etc/hosts
        # Do not remove the following line, or various programs
        # that require network functionality will fail.
        127.0.0.1       localhost.localdomain localhost
        ::1             localhost6.localdomain6 localhost6
        10.10.10.64     node4

    My problem is that this automatically generated hosts file is slightly different from the way I want it (or better: the way Hadoop wants it). The last line should look like this:

        10.10.10.64     node4.mydomain.com node4

    What do I modify where to fix this? Thanks.
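
    One way to handle this without editing the generated file by hand is a small %post section in the kickstart that rewrites that last line; this is only a sketch and assumes anaconda has already written HOSTNAME into /etc/sysconfig/network, as shown above:

        %post
        # pick up the FQDN the installer wrote out
        . /etc/sysconfig/network
        SHORT=${HOSTNAME%%.*}
        # turn "10.10.10.64 node4" into "10.10.10.64 node4.mydomain.com node4"
        sed -i "s|^\([0-9][0-9.]*\)[[:space:]]\+${SHORT}\$|\1 ${HOSTNAME} ${SHORT}|" /etc/hosts
        %end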

    Read the article

  • How can I limit CloudFront downloads

    - by Alex Crouzen
    I'm looking to use Amazon's CloudFront to host some content in the near future. Currently, I'm keeping it very simple and I'm just uploading my content to S3 and then making a distribution available via Cloudfront. However, because I have a limited budget, I'd like to be able to limit the number of downloads or the money spent on bandwidth. As far as I can see, I can't set any quotas or budgets like you can in Google's App Engine, so I'm looking for another way of doing this. Has anyone had any experience doing this? One approach I'm thinking of is having to place a webserver with redirects in between, but that kind of defeats the simplicity of CF for me.

    Read the article

  • September OTN Member Offers

    - by Cassandra Clark - OTN
    Oracle OpenWorld and JavaOne are coming... so the OTN team is knee deep in planning the OTN Lounges that will be at each event this year (more info in another post soon), but we managed to work with our partners to offer a nice BIG list of NEW offers for September. Visit the Oracle Technology Network Member Discount page for codes and links to these great offers!

        Oracle Press - Oracle Technology Network members get 40% off the newest Oracle Press title, Oracle Exalogic Elastic Cloud Handbook by Tom Plunkett!
        Mike Murach & Associates, Inc. - Get 30% off and a sample chapter of Murach’s SQL Server 2012 for Developers by Bryan Syverson and Joel Murach
        Manning - 41% off the titles below and a sample chapter of each: Making Java Groovy; OCA Java SE 7 Programmer I Certification Guide
        Apress - Get 30% off on apress.com on Java 7 Recipes: A Problem-Solution Approach
        Safari Books Online - OTN members get 30 days of free access + 20% off unlimited access to Safari Books Online for 6 months. Safari Books Online offers subscription access to more than 24,000 books and training videos about technology, digital media, business management and professional development from leading publishers such as Oracle Press, O'Reilly Media, Que, Addison-Wesley, Wrox, Cisco Press, Microsoft Press, McGraw Hill, Wiley, Apress, Adobe Press and many others.

    Already a customer? Come see us at Oracle OpenWorld (booth 537) or JavaOne (5110) and mention this to get a shirt!

    Read the article

  • ubuntu server 10 - slow and can't remove desktop environment

    - by Alex
    I'm running ubuntu server 10.10 with the desktop environment. Simple page requests are taking over 5 seconds even when connecting to the server through our local network. I believe this is partially related to having the desktop environment installed, as the server worked faster before (but not as fast as it should considering that it's on the local network), but tasksel fails every time (aptitude failed 100). My knowledge of networking and linux in general is limited, so I would really appreciate ideas on how I can troubleshoot this problem. Oh, also: in the system monitor, one of the processors is almost always at around 100%. I doubt this is normal either...

    Read the article

  • no way to use opendns on pppoe connection?

    - by magisterludi
    I have an old speedtouch usb modem (revision 0) and on my desktop with xubuntu 12.04 I've configured a pppoe connection. I can connect and my ISP assigns an IP address and the DNS servers, but the primary DNS address is not reachable by ping. The secondary is, but no address is resolved, so I can't surf the web. I therefore want to set the OpenDNS servers, but there is no way: if I manually change /etc/resolv.conf it is rewritten by some script (there is the usepeerdns flag in the configuration script; if I exclude it there is no way to assign any DNS server because resolv.conf is not read), even if I make the file not writable by changing the default permissions. I added the line prepend domain-name-servers 208.67.222.222, 208.67.220.220; to dhclient.conf, and now if I connect via wifi to my router I'm using the OpenDNS servers, but as far as I can see ppp does not use this script, and the DNS server is always the one set by my ISP. How can I set the DNS manually for a PPP connection? Is there any way to change it after the connection? Why is NetworkManager not able to manage my dsl connection? It seems unable to manage the dsl usb modem. If I use pppoeconf, NetworkManager doesn't start and I have to manually delete the lines added to /etc/network/interfaces because the system is not able to start with the full network configuration. If I connect a modem-router to the same line I can surf with the DNS server assigned by my ISP; I can't figure out why. Any suggestions? Thanks to all.
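
    A sketch of the usual way around this, assuming pppoeconf created the default peer file /etc/ppp/peers/dsl-provider; with usepeerdns commented out, pppd stops overwriting resolv.conf and the OpenDNS entries stay put:

        sudo sed -i 's/^usepeerdns/#usepeerdns/' /etc/ppp/peers/dsl-provider
        printf 'nameserver 208.67.222.222\nnameserver 208.67.220.220\n' | sudo tee /etc/resolv.conf
        sudo poff dsl-provider && sudo pon dsl-provider   # bounce the PPPoE link so the change takes effect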

    Read the article
