Search Results

Search found 2346 results on 94 pages for 'dedicated'.


  • Using all Ten IO slots on a 7420

    - by user12620172
    So I had the opportunity recently to actually use up all ten slots in a clustered 7420 system. This actually uses 20 slots, or 22 if you count the Clustron cards. I thought it was interesting enough to share here. This is at one of my clients here in southern California. You can see the picture below.

    We have four SAS HBAs instead of the usual two. This is because we wanted to split up the back-end traffic for different workloads. We have a set of disk trays coming from two SAS cards for nothing but Exadata backups. Then, we have a different set of disk trays coming off of the other two SAS cards for non-Exadata workloads, such as regular user file storage. We have two InfiniBand cards which allow us to do a full mesh directly into the back of the nearby production Exadata, specifically for fast backups and restores over IB. You can see a third IB card here, which is going to be connected to a non-production Exadata for slower backups and restores from it.

    The 10Gig card is for client connectivity, allowing other, non-Exadata Oracle databases to make use of the many snapshots and clones that can now be created using the RMAN copies from the original production database coming off the Exadata. This allows a good number of test and development Oracle databases to use these clones without affecting the performance of the Exadata at all. We also have a couple of FC HBAs, both for NDMP backups to an Oracle/StorageTek tape library and for FC clients to come in and use some storage on the 7420.

    Now, if you are adding more cards to your 7420, be aware of which cards you can place in which slots. See the bottom graphic just below the photo. Note that the slots are numbered 0-4 for the first five cards, then the "C" slot, which holds the dedicated cluster card (called the Clustron), and then another five slots numbered 5-9. Some rules for the slots: Slots 1 & 8 are automatically populated with the two default SAS cards. The only other slots you can add SAS cards to are 2 & 7. Slots 0 and 9 can only hold FC cards. Nothing else. So if you have four SAS cards, you are down to only four more slots for your 10Gig and IB cards. Be sure not to waste one of those slots on an FC card, which can go into 0 or 9 instead. If at all possible, slots should be populated in this order: 9, 0, 7, 2, 6, 3, 5, 4
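
    Purely as an illustration, here is a small Python sketch that encodes the placement rules above; only the slot numbers and the fill order come from the post, and the card-type labels are invented for the example:

        # Slot rules for a clustered 7420 as described above. Card-type strings
        # ("SAS", "FC", "IB", "10GbE") are hypothetical labels for the example.
        SAS_SLOTS = {1, 2, 7, 8}        # SAS HBAs may only live in 1, 8 (default) plus 2, 7
        FC_ONLY_SLOTS = {0, 9}          # slots 0 and 9 can hold FC cards and nothing else
        FILL_ORDER = [9, 0, 7, 2, 6, 3, 5, 4]   # preferred population order

        def violations(layout):
            """layout maps slot number -> card type; returns any rule violations."""
            problems = []
            for slot, card in sorted(layout.items()):
                if card == "SAS" and slot not in SAS_SLOTS:
                    problems.append(f"SAS card in slot {slot}; only {sorted(SAS_SLOTS)} allowed")
                if slot in FC_ONLY_SLOTS and card != "FC":
                    problems.append(f"{card} card in slot {slot}; slots 0 and 9 are FC-only")
            return problems

        # The layout described in the post: four SAS, two FC, three IB, one 10GbE.
        layout = {1: "SAS", 8: "SAS", 2: "SAS", 7: "SAS",
                  0: "FC", 9: "FC", 6: "IB", 3: "IB", 5: "IB", 4: "10GbE"}
        print(violations(layout))       # prints [] because nothing breaks the rules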

    Read the article

  • What's Happening in Business Analytics at OpenWorld 2012?

    - by jmorourke
    Oracle OpenWorld 2012 is rapidly approaching on September 30th, when we take over the city of San Francisco for five days. The Business Analytics program this year is our strongest ever, with over 150 EPM, BI, Analytics and Data Warehousing sessions delivered by Oracle, our customers and partners. We'll also have Hands-On Labs, 20 demo pods dedicated to Business Analytics products, and over 30 partners exhibiting their solutions. So what's hot in the Business Analytics program at OpenWorld? Here are some of the "can't miss" sessions at this year's conference:

    The EPM and BI general sessions, led by SVP of Product Development Balaji Yelamanchili, will highlight what's new and provide a view into Oracle's EPM, BI and Analytics strategies. Both sessions are scheduled on Monday, October 1st.

    Thursday Keynote: See More, Act Faster: Oracle Business Analytics, led by Oracle President Mark Hurd, will provide a view into Oracle's strategy for Business Analytics, especially engineered systems designed to provide extreme performance for the most rigorous analytic tasks.

    Superfast Business Intelligence with Oracle Exalytics. Hear about various business intelligence scenarios in which Oracle Exalytics provides exemplary value, from operational reporting and prepackaged applications to analytics on unstructured data.

    Turn Insights into Real-Time Actions with Oracle Business Intelligence Mobile. Learn how Oracle Business Intelligence Mobile enables organizations to deliver relevant information and turn insight into real-time action, no matter where employees are located.

    Empowering the Business User: Introduction to Oracle Endeca Information Discovery. Find out how you can find fast answers to the new questions that confront your business every day, while avoiding the confusion and inconsistencies brought about by spreadsheets and desktop tools.

    Big Data: The Big Story. Learn how to harness big data, your existing data, and predictive analytics to make better decisions in an environment of rapid shifts in behavior and instant feedback. Learn about the technologies that constitute a big data architecture, how to leverage and implement advanced analytics for real-time decisions, and the tools needed to know the unknown.

    Planning at the Speed of Business with Oracle Exalytics. Learn how Oracle Hyperion Planning leverages the power of Oracle Exalytics to do planning faster, with more detail and more users than ever.

    For more details on these and other Business Analytics sessions at OpenWorld, download the Focus On Business Analytics program guide at: http://www.oracle.com/openworld/focus-on/index.html We look forward to seeing you in San Francisco!

    Read the article

  • Encapsulating code in F# (Part 1)

    - by MarkPearl
    I have been looking at F# for a while now and have seen a few really interesting samples and how-to snippets. This has been great for seeing the basic outline of the language and the possibilities; however, a nagging question in the back of my mind has been: what does an F# project look like? How do I group code in F# so that it can be modularized and brought in and out of a project easily? My Expert F# book has an entire chapter (7) dedicated to this, and after browsing the other chapters of the book I decided that this topic was something I really wanted to know more about now! Because of my C# background I keep trying to think in F# of objects. So to try and get a clearer idea of how to do things the F# way, I am first going to take a very simplified C# example and try to "translate" it.

        using System;
        namespace ConsoleApplication1
        {
            namespace ExampleOfEncapsulationInCSharp
            {
                class Program
                {
                    static void EncapsulatedVariableInAMethod()
                    {
                        int count = 10;
                        Console.WriteLine(count);
                    }

                    static void Main(string[] args)
                    {
                        EncapsulatedVariableInAMethod();
                        Console.ReadLine();
                    }
                }
            }
        }

    From the above example, the count integer is encapsulated within the EncapsulatedVariableInAMethod method. You couldn't access the count variable from outside the scope of its parent method, but you have full access to it within the method. Let's look at my F# equivalent...

        open System

        let EncapsulatedVariableInAMethod =
            let count = 10
            Console.WriteLine(count)
            ()

        EncapsulatedVariableInAMethod
        Console.ReadLine()

    Now, when I first attempted to write the F# code I got stuck... I didn't have the Console.WriteLine calls but had the following...

        open System

        let EncapsulatedVariableInAMethod =
            let count = 10

        EncapsulatedVariableInAMethod
        Console.ReadLine()

    The compiler didn't like the let before the count = 10. This is because every F# expression must evaluate to a value. If I did not want to make the Console call, I would still need to evaluate the expression to something, and for this reason the unit type is provided. I could have done something like...

        open System

        let EncapsulatedVariableInAMethod =
            let count = 10
            ()

        EncapsulatedVariableInAMethod
        Console.ReadLine()

    ...which the compiler would be happy with.

    Read the article

  • Cloud Computing Business Benefits

    - by workflowman
    If you have been living under a rock for the past year, you might not have heard about cloud computing. Cloud computing is a loose term that describes anything that is hosted in data centers and accessed via the internet. It is normally associated with developers who draw clouds in diagrams indicating where services live or how systems communicate with each other. Cloud computing also incorporates such well-known trends as Web 2.0 and Software as a Service (SaaS) and, more recently, Infrastructure as a Service (IaaS) and Platform as a Service (PaaS). Its aim is to change the way we compute, moving from traditional desktop and on-premises servers to services and resources that are hosted in the cloud.

    Benefits of Cloud Computing

    There are clearly benefits in building applications using cloud computing, some of which are listed here:

    Zero up-front investment: Delivering a large-scale system costs a fortune in both time and money. Often IT departments are split into hardware/network and software services. The hardware team provisions servers and so forth according to the requirements of the software team. Often the hardware team has a different budget that requires approval. Although hardware and software management are two separate disciplines, sometimes developers are given the task of estimating CPU cycles, disk space, and so forth, which ends up in underutilized servers.

    Usage-based costing: You pay for what you use, no more, no less, because you never actually own the server. This is similar to car leasing, where in the long run you get a new car every three years and maintenance is never a worry.

    Potential for shrinking the processing time: If processes are split over multiple machines, parallel processing is performed, which decreases processing time.

    More office space: Walk into most offices and you are almost guaranteed to find a medium-sized room dedicated to servers.

    Efficient resource utilization: Resource utilization is handled by a centralized cloud administrator who is in charge of deciding exactly the right amount of resources for a system. This takes the task away from local administrators, who would otherwise have to regularly monitor these servers.

    Just-in-time infrastructure: If your system is a success and needs to scale to meet demand, provisioning delays can mean a slow-performing service. Cloud computing solves this because you can add more resources at any time.

    Lower environmental impact: If servers are centralized, an environmental initiative is more likely to succeed. As an example, if servers are placed in sunny or windy parts of the world, why not use those resources to power them?

    Lower costs: Unfortunately, this is one point that administrators will not like. If you have people administrating your e-mail server and network along with support staff doing other tasks that move to the cloud, this workforce can be reduced. This saves costs, though it also reduces jobs.

    Read the article

  • Developing web sites that imitate desktop apps. How to fight that paradigm? [closed]

    - by user1598390
    Suppose there's a company where web sites/apps are designed to resemble desktop apps. They strive to add:

    - Splash screens
    - Drop-down menus
    - Tab pages
    - Pages that don't grow downward with content; content sits inside a scrollable area, so the page is a fixed size, as if mimicking the one-screen limitation of desktop apps
    - Modal windows, pop-ups, etc.
    - Tree views
    - Absolutely no access to content unless you log in first, even for non-sensitive content; after the splash screen disappears, you are presented with a login screen
    - No links, just simulated buttons
    - Fixed page size; you cannot open a link in another tab
    - A print button that prints directly (not showing a printable page, so the user can't print via the browser's print command)
    - Progress bars for loading content, even when the browser indicates it with its own animation
    - Fonts and colors that emulate a desktop app made with Visual Basic, PowerBuilder, etc.; every app looks almost as if it were made in Visual Basic

    They reject these elements:

    - Breadcrumbs
    - Good old underlined links
    - Generated/dynamic navigation, usage-based suggestions
    - The ability to open links in multiple tabs
    - Pagination
    - Printable pages
    - The ability to produce a URL you can save or share that links to an item, like when you send someone the link to a specific StackExchange question; the only URL is the main one
    - The Back button

    To achieve this, tons of JavaScript code is needed. Lots and lots of JavaScript and Ajax code for things not related to the business but to the need to hide/show that button, refresh this listbox, grey out that label, etc. The complexity generated by forcing one paradigm into another means most lines of code are dedicated to maintaining the illusion of a desktop app. What is the best way to change this mindset, make them embrace the web, and start producing modern web apps instead of desktop imitations?

    EDIT: These sites are intranet sites. Users hate these apps. They constantly whine about them, but they have to use them to do their daily work. These sites are in-house solutions; the end users have no choice but to use them. They are a "captive audience". Also, substitution will not happen because of the high costs. But at least if that mindset is changed, new developments would be more web-like.

    Read the article

  • "Collection Wrapper" pattern - is this common?

    - by Prog
    A different question of mine had to do with encapsulating member data structures inside classes. In order to understand this question better, please read that question and look at the approach discussed. One of the people who answered that question said that the approach is good, but, if I understood him correctly, that there should be a class existing just for the purpose of wrapping the collection, instead of an ordinary class offering a number of public methods just to access the member collection. For example, instead of this:

        class SomeClass {
            // downright exposing the concrete collection.
            Thing[] someCollection;
            // other stuff omitted
            Thing[] getCollection() { return someCollection; }
        }

    Or this:

        class SomeClass {
            // encapsulating the collection, but inflating the class' public interface.
            Thing[] someCollection;
            // class functionality omitted.
            public Thing getThing(int index) { return someCollection[index]; }
            public int getSize() { return someCollection.length; }
            public void setThing(int index, Thing thing) { someCollection[index] = thing; }
            public void removeThing(int index) { someCollection[index] = null; }
        }

    We'll have this:

        // encapsulating the collection - in a different class, dedicated to this.
        class SomeClass {
            CollectionWrapper someCollection;
            CollectionWrapper getCollection() { return someCollection; }
        }

        class CollectionWrapper {
            Thing[] someCollection;
            public Thing getThing(int index) { return someCollection[index]; }
            public int getSize() { return someCollection.length; }
            public void setThing(int index, Thing thing) { someCollection[index] = thing; }
            public void removeThing(int index) { someCollection[index] = null; }
        }

    This way, the inner data structure in SomeClass can change without affecting client code, and without forcing SomeClass to offer a lot of public methods just to access the inner collection. CollectionWrapper does this instead. E.g. if the collection changes from an array to a List, the internal implementation of CollectionWrapper changes, but client code stays the same. Also, the CollectionWrapper can hide certain things from the client code; for example, it can disallow mutation of the collection by not having the methods setThing and removeThing. This approach to decoupling client code from the concrete data structure seems IMHO pretty good. Is this approach common? What are its downsides? Is it used in practice?
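
    For what it's worth, here is a small Python sketch of the same idea (a hypothetical ThingCollection wrapper, not code from the original question): the wrapper's internal storage can change, and a read-only variant can simply omit or refuse the mutating methods, while client code keeps calling the same interface.

        class ThingCollection:
            """Wraps the concrete storage; clients never see the list itself."""
            def __init__(self, things):
                self._things = list(things)   # could become a dict, deque, etc.

            def get(self, index):
                return self._things[index]

            def size(self):
                return len(self._things)

            def set(self, index, thing):
                self._things[index] = thing

        class ReadOnlyThingCollection(ThingCollection):
            """Same reading interface, but mutation is simply not offered."""
            def set(self, index, thing):
                raise TypeError("this collection is read-only")

        def client_code(collection):
            # Depends only on get()/size(), so swapping the storage inside
            # ThingCollection never forces a change here.
            return [collection.get(i) for i in range(collection.size())]

        print(client_code(ThingCollection(["a", "b", "c"])))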

    Read the article

  • How to disable monitor auto detection in Windows 7?

    - by Jay Yother
    I am currently running Windows 7 Ultimate 64-bit with a dual monitor setup with an NVIDIA 7950 GT graphics card. One monitor is dedicated to this machine and the other monitor is connected to a DVI KVM switch. When I switch to my other computer, Windows 7 disables the monitor. However, when I switch back it does not re-enable the monitor. The only circumstance that automatically re-enables the second monitor is when I switch back after Windows has put the monitors into power save mode. I am continually having to bring up the NVIDIA control panel to have it re-enable the monitor. Under Windows XP I would just disable the NVIDIA service to prevent it from auto-detecting the monitor (which doesn't solve the problem under Win7), and in Vista there was a registry hack that would prevent this. It looks as though that has been removed in Windows 7. I have found similar questions posted on this site, but nothing that matches my problem exactly. The following link is the question that comes the closest, but does not provide a solution to the problem. http://superuser.com/questions/96683/how-to-fix-monitor-detection-on-windows-7 Is there a way in Windows 7 to disable monitor auto-detection?

    Read the article

  • Problem with Email Notifications in VisualSVN Server

    - by emzero
    Hey guys! I have a dedicated server running Windows Server 2003 and VisualSVN Server 2.0.8. I'm trying to configure it to send email notifications on commit, so I found this article on the VisualSVN site. It says I have to edit the post-commit hook and set it to the following:

        "%VISUALSVN_SERVER%\bin\VisualSVNServerHooks.exe" ^
             commit-notification "%1" -r %2 ^
             --from <from-email> --to <to-email> ^
             --smtp-server <smtp-server>

    Of course I've replaced the variables there. The problem is that when someone commits something, the svn client throws the following error:

        post-commit hook failed (exit code 1) with no output.

    The commit itself runs with no problems, I mean it does commit the files, but it won't send any email notification. If I remove the post-commit hook, then I don't get the error (and of course I don't get any notification). Could you help me out with it? The error doesn't tell me much =S Thank you!
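
    As a point of comparison only, here is a rough sketch of what a commit-notification hook does, written as a generic Python script rather than the VisualSVNServerHooks.exe call above. The repository path and revision are the two arguments Subversion passes to the hook (the %1 and %2 in the batch file); the addresses and SMTP host are placeholders.

        # Hypothetical stand-in for a post-commit notification hook.
        import subprocess
        import smtplib
        import sys
        from email.mime.text import MIMEText

        repo, rev = sys.argv[1], sys.argv[2]

        def svnlook(subcommand):
            # svnlook ships with Subversion and reads the repository directly.
            return subprocess.check_output(
                ["svnlook", subcommand, "-r", rev, repo], text=True).strip()

        msg = MIMEText("Author: %s\n\n%s\n\nChanged paths:\n%s"
                       % (svnlook("author"), svnlook("log"), svnlook("changed")))
        msg["Subject"] = "Commit r%s" % rev
        msg["From"] = "svn@example.com"       # placeholder
        msg["To"] = "team@example.com"        # placeholder

        with smtplib.SMTP("smtp.example.com") as smtp:   # placeholder SMTP host
            smtp.send_message(msg)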

    Read the article

  • i5 540M or i7 720QM for laptop running VMs and software development tools?

    - by Donald Hughes
    I'm a software developer who would be running Windows 7 as the primary operating system. On a typical day, I might, at any given moment, be running Visual Studio, Expression Web, SQL Server developer (and Management Console), IIS, Photoshop, a dozen browser tabs in 2-3 different browsers, Skype video chat, streaming music, and a couple of VMs (WinXP and Ubuntu) for testing/experimentation. Obviously, RAM is a concern, which is why I plan to use 8 GB so I can devote enough to the VMs to be usable. I'm also tempted to use an ExpressCard SSD for storing the VM disks to ease disk contention. And I know that that is asking a lot from a laptop, and I should just use a desktop, but I need to be able to take my work with me between several locations. It seems that at a reasonable price point, it comes down to the i5 540M versus the i7 720QM. I'm leaning toward the i7 since it would allow me to dedicate a whole hyperthreaded core to each VM, and still have two cores left for the primary OS. I've heard that the i5 has better battery life, but I'm curious whether, for my scenario, there would be a meaningful difference. I don't usually work without a plug, but I do occasionally ride the train or fly, and it would be nice to have at least 3 hours of juice for unusual circumstances. And, finally, for this usage scenario, would a dedicated video option be preferred over the i5's integrated video? It sounds like Visual Studio 2010 (and Windows 7) can take advantage of the video card.

    Read the article

  • extreme slowness with a remote database in Drupal

    - by ceejayoz
    We're attempting to scale our Drupal installations up and have decided on some dedicated MySQL boxes. Unfortunately, we're running into extreme slowness when we attempt to use the remote DB: page load times go from ~200 milliseconds to 5-10 seconds. Latency between the servers is minimal, a tenth or two of a millisecond.

        PING 10.37.66.175 (10.37.66.175) 56(84) bytes of data.
        64 bytes from 10.37.66.175: icmp_seq=1 ttl=64 time=0.145 ms
        64 bytes from 10.37.66.175: icmp_seq=2 ttl=64 time=0.157 ms
        64 bytes from 10.37.66.175: icmp_seq=3 ttl=64 time=0.157 ms
        64 bytes from 10.37.66.175: icmp_seq=4 ttl=64 time=0.144 ms
        64 bytes from 10.37.66.175: icmp_seq=5 ttl=64 time=0.121 ms
        64 bytes from 10.37.66.175: icmp_seq=6 ttl=64 time=0.122 ms
        64 bytes from 10.37.66.175: icmp_seq=7 ttl=64 time=0.163 ms
        64 bytes from 10.37.66.175: icmp_seq=8 ttl=64 time=0.115 ms
        64 bytes from 10.37.66.175: icmp_seq=9 ttl=64 time=0.484 ms
        64 bytes from 10.37.66.175: icmp_seq=10 ttl=64 time=0.156 ms

        --- 10.37.66.175 ping statistics ---
        10 packets transmitted, 10 received, 0% packet loss, time 8998ms
        rtt min/avg/max/mdev = 0.115/0.176/0.484/0.104 ms

    Drupal's devel.module timers show the database queries aren't running any slower on the remote DB: about 150 microseconds whether it's the local or the remote server. Profiling with XHProf shows PHP execution times that aren't out of whack, either. The number of queries doesn't seem to make a difference; we see the same 5-10 second delay whether a page has 12 queries or 250. Any suggestions about where I should start troubleshooting here? I'm quite confused.
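
    A quick back-of-envelope sketch using the numbers quoted above (illustrative only) shows why these figures look so strange: even 250 queries at these per-query times and round-trip latencies should add well under a second per page.

        # Rough arithmetic from the figures in the question.
        rtt_ms = 0.176       # average ping round trip to the DB box
        query_ms = 0.150     # per-query time reported by devel.module
        for queries in (12, 250):
            total_s = queries * (rtt_ms + query_ms) / 1000.0
            print(f"{queries:3d} queries -> ~{total_s:.3f} s of DB round trips per page")
        # 250 queries -> ~0.082 s, nowhere near the observed 5-10 second page loads.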

    Read the article

  • How do I upgrade PHP on CentOS and Kloxo?

    - by Emerson
    I need to upgrade PHP so that I can upgrade Joomla on my dedicated server. I have:

    - Kloxo 6.1.6
    - php-5.2.17-1
    - Linux CentOS-55-64-minimal 2.6.18-194.32.1.el5 x86_64 x86_64 x86_64 GNU/Linux

    I searched everywhere and I could only find that PHP 5.3 isn't compatible with Zend. I would like to upgrade to 5.2.4, which is the minimum for Joomla 1.6 and 1.7. I tried to run:

        yum update php.x86_64

    which is the installed PHP package, but it didn't work. This is a production server with quite a few users across many sites, so I wanted to do it as safely as possible. Is it safe to run "yum update"? It showed me 6 packages to install and 125 packages to update, including a kernel. Is that safe? I haven't touched Kloxo's yum repositories. Update: I just successfully ran "yum update". Now I think I need to know how to add a new repository that has the 5.2.4 version and how to update to that specific version. Any ideas?

    Read the article

  • Router for creating site to site VPN to server provider using Cisco ASA 5540

    - by fondie
    We have dedicated servers hosted for us by a third party, and we connect to these over a VPN. My server provider uses the Cisco ASA 5540 as its VPN device. Currently we're using software clients on individual machines to connect to this VPN, either:

    - Cisco VPN Client
    - Shrew Soft VPN Connect

    However, I'm looking to purchase a new load-balancing router for our office and thought this could be an opportunity to have hardware take over VPN client duties. We could then create a permanent VPN tunnel that could be used by anyone on the network with no software client necessary. Sadly I'm not the most knowledgeable on this kind of stuff, so: 1) Is this a realizable goal? Next I need to know what kind of hardware I will need. I'm not looking to spend lots of money on this (~$500), so it's doubtful I can afford any Cisco kit. Therefore, this is the most promising candidate I've seen (as far as my limited knowledge goes): Draytek Vigor 2955 - http://www.draytek.co.uk/products/vigor2955.html 2) Would this be compatible with the Cisco kit my server provider uses? 3) If not, are there any alternatives I should consider? Many thanks in advance.

    Read the article

  • Apache+LDAP auth on Ubuntu says "Can't contact LDAP server" while ldapsearch is perfect

    - by tw79
    Hi gurus, I'm migrating from an existing apache+LDAP+mysql+php server to a new hardware platform. The old server is running Debian Lenny, for which I have no config documentation (it was set up by the previous sysadmin); the new server is running Ubuntu 10.04.2 LTS 32-bit. After installing Apache and configuring the LDAP client on the new server, ldapsearch to the LDAP master (another dedicated server) returns results just fine. However, when using Apache with https, the logs complain "Can't contact LDAP server". I'm authenticating using ldaps and can confirm that port 636 is open on the LDAP master. I can't understand why Apache would fail while regular ldapsearch works! Below is part of the virtualhost config:

        <Directory />
            Options FollowSymLinks
            AllowOverride None
            #AuthLDAPEnabled on
            AuthType Basic
            AuthBasicProvider ldap
            AuthName "Private"
            AuthLDAPURL ldaps://master.ldap.organisation.com:636/ou=people,dc=organisation,dc=com?uid
            AuthzLDAPAuthoritative off
            require valid-user
            AddType application/x-httpd-php .php .phtml
            <IfModule mod_php4.c>
                php_flag magic_quotes_gpc Off
                php_flag track_vars On
                php_value include_path .
            </IfModule>
        </Directory>

    Any help/suggestion is very much appreciated!
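
    One illustrative way to reproduce the query described by the AuthLDAPURL above, outside of Apache, is a small python-ldap sketch like the one below (assuming the python-ldap module is installed; the uid is a placeholder). It can help narrow down whether certificate verification is the difference, since ldapsearch takes its TLS settings from the system ldap.conf while mod_ldap has its own.

        import ldap

        uri = "ldaps://master.ldap.organisation.com:636"
        base = "ou=people,dc=organisation,dc=com"

        # Skip server certificate verification, purely to test whether the
        # TLS handshake (rather than the bind or the search) is what fails.
        ldap.set_option(ldap.OPT_X_TLS_REQUIRE_CERT, ldap.OPT_X_TLS_NEVER)

        conn = ldap.initialize(uri)
        conn.simple_bind_s()                       # anonymous bind, as in the config
        print(conn.search_s(base, ldap.SCOPE_SUBTREE, "(uid=someuser)"))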

    Read the article

  • FTP timing out after login

    - by Imran
    For some reason I can't access any of my accounts on my dedicated server via FTP. It simply times out when it tries to display the directories. Here's a log from FileZilla:

        Status: Resolving address of testdomain.com
        Status: Connecting to 64.237.58.43:21...
        Status: Connection established, waiting for welcome message...
        Response: 220---------- Welcome to Pure-FTPd [TLS] ----------
        Response: 220-You are user number 3 of 50 allowed.
        Response: 220-Local time is now 19:39. Server port: 21.
        Response: 220-This is a private system - No anonymous login
        Response: 220-IPv6 connections are also welcome on this server.
        Response: 220 You will be disconnected after 15 minutes of inactivity.
        Command: USER testaccount
        Response: 331 User testaccount OK. Password required
        Command: PASS ********
        Response: 230-User testaccount has group access to: testaccount
        Response: 230 OK. Current restricted directory is /
        Command: SYST
        Response: 215 UNIX Type: L8
        Command: FEAT
        Response: 211-Extensions supported:
        Response: EPRT
        Response: IDLE
        Response: MDTM
        Response: SIZE
        Response: REST STREAM
        Response: MLST type*;size*;sizd*;modify*;UNIX.mode*;UNIX.uid*;UNIX.gid*;unique*;
        Response: MLSD
        Response: ESTP
        Response: PASV
        Response: EPSV
        Response: SPSV
        Response: ESTA
        Response: AUTH TLS
        Response: PBSZ
        Response: PROT
        Response: 211 End.
        Status: Connected
        Status: Retrieving directory listing...
        Command: PWD
        Response: 257 "/" is your current location
        Command: TYPE I
        Response: 200 TYPE is now 8-bit binary
        Command: PASV
        Response: 227 Entering Passive Mode (64,237,58,43,145,153)
        Command: MLSD
        Response: 150 Accepted data connection
        Response: 226-ASCII
        Response: 226-Options: -a -l
        Response: 226 18 matches total
        Error: Connection timed out
        Error: Failed to retrieve directory listing

    I have restarted the FTP service several times but it still doesn't load. I only have this problem when my server is reaching its peak usage, which is still only a load of 1.0 (4 cores) and 40% of 4GB of RAM. The FTP connections aren't maxed out, because only my colleague and I have access to FTP on the server.

    Read the article

  • What is there in Win 7 Pro (or Ultimate) that is not there in Home Premium? - Especially considering this situation..

    - by Senthil
    I want to know the REAL difference between Windows 7 Home Premium and Professional/Ultimate. In India, the cost of the different versions is:

    - Ultimate - 11,200 INR
    - Professional - 10,700 INR
    - Home Premium - 6,600 INR

    The absolute cost of the first two is so high to me that the difference (500 INR) doesn't matter. So to me there is really no choice between the first two: if I decide to buy the Professional version, I'd rather go for Ultimate itself. What I want to know is whether Home Premium is enough for my needs. I tried searching for comparisons, but many look like marketing junk from MS. They are short and vague. According to this page, the major differences between Pro and Home Premium are: run many Windows XP productivity programs in Windows XP Mode, and connect to company networks easily and more securely with Domain Join. You can do both in Pro but not in Home Premium. I intend to use my Windows 7 for a small business that is just starting up, so I'll be dealing with the following:

    - All kinds of development tools and servers
    - Very important - I will run virtual machine software (MS VPC, VMware, Sun VirtualBox, etc.)
    - My system will be acting as the server for most purposes till I can afford dedicated servers
    - Connecting the system to a variety of network devices (PCs, printers, etc.)
    - Running productivity, business and financial apps
    - Any other small software startup business requirement that I haven't thought of yet

    Professional (and Ultimate) is twice as expensive as Home Premium. So it'd be great if someone can point out the things you cannot do with Home Premium when you use it like I explained above, so that I can make a decision about which one to buy. I need some real-life experiences so that I can make an informed decision, not a decision based on marketing junk.

    Read the article

  • Web based KVM management for Ubuntu

    - by Tim
    Hey all, we've got a single Ubuntu 9.10 root server on which we want to run multiple KVM virtual machines. To administer these virtual machines I'd like a web-based KVM management tool, but I don't know which one to choose from the list of tools mentioned on linux-kvm.org. I've used virsh & virt-manager on my desktop, but would like a web interface for the server. I tested ConVirt on my desktop, but it failed to pick up KVM machines from virsh / virt-manager, and I could not get KVM virtual machine import to work (only Xen). oVirt looks good, but I can't find out if and how I can install it on Ubuntu 9.10. (And I'd really rather not waste another few days on testing stuff that might not work in the end.) Can anyone recommend any good web-based KVM management tools that are easy to install on Ubuntu 9.10? I'm looking for something that will also allow me to run other services like Apache and PostgreSQL besides hosting virtual machines, so preferably fairly lightweight & no dedicated OS installs. We don't need any professional clustering / migration or anything, just something that will let us create, start, inspect, administer & stop virtual machines from a web page. Best regards, Tim

    Read the article

  • Postfix mail server: can't connect via POP/IMAP

    - by MelkerOVan
    I've followed this guide on setting up a mail server on my dedicated server. I've been able to send mail from the PHP application I'm using and from the Linux command line (using telnet, php, etc). The problem is that I cannot connect to the server via IMAP/POP, which I've set up using Courier. I've tried using Thunderbird, but it complains that the username or password is wrong. I doubt it is the username/password, but I don't know how to troubleshoot this. Edit: Here are the messages in mail.log:

        Jan 9 22:43:38 mail authdaemond: received auth request, service=imap, authtype=login
        Jan 9 22:43:38 mail authdaemond: authmysql: trying this module
        Jan 9 22:43:38 mail authdaemond: SQL query: SELECT id, crypt, "", uid, gid, home, "", "", name, "" FROM users WHERE id = '[email protected]' AND (enabled=1)
        Jan 9 22:43:38 mail authdaemond: password matches successfully
        Jan 9 22:43:38 mail authdaemond: authmysql: sysusername=<null>, sysuserid=5000, sysgroupid=5000, homedir=/var/spool/mail/virtual, [email protected], fullname=peter, maildir=<null>, quota=<null>, options=<null>
        Jan 9 22:43:38 mail authdaemond: authmysql: clearpasswd=<null>, passwd=6SrBcYq65l8QU
        Jan 9 22:43:38 mail authdaemond: Authenticated: sysusername=<null>, sysuserid=5000, sysgroupid=5000, homedir=/var/spool/mail/virtual, [email protected], fullname=peter, maildir=<null>, quota=<null>, options=<null>
        Jan 9 22:43:38 mail authdaemond: Authenticated: clearpasswd=peter, passwd=6SrBcYq65l8QU
        Jan 9 22:43:38 mail imapd: chdir Maildir: No such file or directory

    Read the article

  • Sync OneNote Notebooks to/on SkyDrive

    - by Sam
    I've got OneNote running on all the computers in our house, and we use it all the time with several people and computers. The only drawback: I want to keep the copies of OneNote in sync without having to run a dedicated server myself. Right now one of my computers has a folder share that all the others sync to, but this is highly impractical since that computer is not always running. So my question is: is it possible to put the notebook files in a (private) SkyDrive folder and have all the computers sync to there? This way all computers could keep in sync whenever they get access to the web. Can this be done? And, of course, how? [Update] Maybe I should not have taken knowledge about OneNote for granted: OneNote uses a proprietary file format, but has very good in-file syncing that works on network shares. Generic 'just sync the complete file' won't be useful at all, because I'd just have 'file has changed on server and on client' conflicts all the time. The sync needs to know OneNote files and be able to sync the content - e.g. OneNote itself needs to sync the files, not some generic sync tool.

    Read the article

  • TeamCity EC2 Integration via ISA Server

    - by Tim Long
    I have a TeamCity server which is actually installed on SBS 2003 Premium with ISA Server (firewall/proxy) installed. My ADSL connection has multiple IP addresses, which all resolve directly to my SBS external NIC. The NIC is therefore multi-homed and I have allocated one of the IP addresses specifically to TeamCity. In ISA, I've created an access rule to allow the traffic in. I can access my TeamCity server externally and view the web interface, that all works fine. I want to use the Amazon EC2 integration in TeamCity to launch build agents 'in the cloud'. The problem I am having is that when the agent starts, it sees the server and registers, then just sits there waiting. On the server side, the agent appears as 'disconnected'. Examining the settings, the agent's IP address appears to be that of the external NIC. What I think might be happening is that the traffic is undergoing Network Address Translation (NAT) so that TeamCity always thinks the agent is locally installed and therefore can't communicate with the actual remote agent. This seems to happen even though I have a permanent static IP address dedicated to TeamCity. So, the question is this. How can I make traffic to a specific IP address pass through the ISA server un-NATted?

    Read the article

  • Issue configuring Oracle database for SSL

    - by Santhosha Kaldambe
    Hello, I want to set up Oracle for SSL communication. I am not using SSL authentication for the database user. As a first requirement, I generated a self-signed certificate using OpenSSL and added the certificate to a wallet. The wallet location is specified in the server configuration. I created the listener and it starts; however, it does not provide any service. The default (non-SSL) listener is working fine. When I execute LSNRCTL.EXE status SSLLISTENER it gives the output below.

        STATUS of the LISTENER
        Alias                     SSLLISTENER
        Version                   TNSLSNR for 32-bit Windows: Version 11.1.0.6.0 - Production
        Start Date                14-NOV-2009 01:47:08
        Uptime                    16 days 22 hr. 14 min. 3 sec
        Trace Level               off
        Security                  ON: Local OS Authentication
        SNMP                      OFF
        Listener Parameter File   C:\app\Administrator\product\11.1.0\db_1\network\admin\listener.ora
        Listener Log File         c:\app\administrator\diag\tnslsnr\\ssllistener\alert\log.xml
        Listening Endpoints Summary...
          (DESCRIPTION=(ADDRESS=(PROTOCOL=tcps)(HOST=)(PORT=2484)))
        The listener supports no services
        The command completed successfully

    Here is the exact content of the various files after configuration.

    1) File name: tnsnames.ora

        ORCL =
          (DESCRIPTION =
            (ADDRESS_LIST =
              (ADDRESS = (PROTOCOL = TCP)(HOST = )(PORT = 1521))
            )
            (CONNECT_DATA =
              (SERVER = DEDICATED)
              (SERVICE_NAME = orcl)
            )
          )

    2) File name: sqlnet.ora

        SSL_VERSION = 0
        NAMES.DIRECTORY_PATH= (TNSNAMES, EZCONNECT)
        sqlnet.authentication_services= (NONE)
        tcp.validnode_checking = no
        tcp.invited_nodes=(PS0803.oraebs.com,PS2948,PS5098)
        SSL_CLIENT_AUTHENTICATION = FALSE
        WALLET_LOCATION =
          (SOURCE =
            (METHOD = FILE)
            (METHOD_DATA =
              (DIRECTORY = C:\app\Administrator\admin\orcl\Server_Wallet)
            )
          )

    3) File name: listener.ora

        SSL_CLIENT_AUTHENTICATION = FALSE
        WALLET_LOCATION =
          (SOURCE =
            (METHOD = FILE)
            (METHOD_DATA =
              (DIRECTORY = C:\app\Administrator\admin\orcl\Server_Wallet)
            )
          )
        LISTENER =
          (DESCRIPTION_LIST =
            (DESCRIPTION =
              (ADDRESS = (PROTOCOL = IPC)(KEY = EXTPROC1521))
            )
            (DESCRIPTION =
              (ADDRESS = (PROTOCOL = TCP)(HOST = )(PORT = 1521))
            )
          )
        SSLLISTENER =
          (DESCRIPTION =
            (ADDRESS = (PROTOCOL = TCPS)(HOST = )(PORT = 2484))
          )

    Thanks, Santhosha

    Read the article

  • Exim service cPanel error

    - by Luka
    I cleaned out some logs from my cPanel dedicated server. Following http://linuxhostingsupport.net/blog/log-files-on-a-cpanel-server I deleted all the logs listed at that link. The problem is with the Exim process: it cannot shut down, but it can start. When I try to send email from Roundcube or Horde, or via SMTP, it is down. Port 25 is down; I can neither receive nor send mail. But a minute before cleaning the logs I had received mail and could send mail. What is the problem? I just deleted logs... When I try service exim restart I get:

        Shutting down clamd: [ OK ]
        Shutting down exim: [FAILED]
        Shutting down spamd: [ OK ]
        Starting clamd: [ OK ]
        Starting exim: [ OK ]
        0 processes (antirelayd) sent signal 9
        /usr/local/cpanel/scripts/update_sa_rules: running in background

    Exim log:

        2012-10-20 03:06:14 cwd=/ 3 args: /usr/sbin/exim -bd -q1h
        2012-10-20 03:06:24 cwd=/ 3 args: /usr/sbin/exim -bd -q1h
        2012-10-20 03:06:32 cwd=/ 3 args: /usr/sbin/exim -bd -q1h
        2012-10-20 03:06:34 cwd=/ 2 args: /usr/sbin/sendmail -t
        2012-10-20 03:08:20 cwd=/ 3 args: /usr/sbin/exim -bd -q1h
        2012-10-20 03:11:37 cwd=/ 2 args: /usr/sbin/sendmail -t
        2012-10-20 03:13:45 cwd=/ 3 args: /usr/sbin/exim -bd -q1h
        2012-10-20 03:14:01 cwd=/ 3 args: /usr/sbin/exim -bd -q1h
        2012-10-20 03:14:28 cwd=/home/pegaz/public_html 3 args: /usr/sbin/sendmail -t -i

    Read the article

  • SAN with iSCSI-Target Performance Horrendous

    - by Justin
    We have a poor man's SAN set up in a 1U Ubuntu server running iSCSI-Target with two 300GB drives in RAID-0. We are then using it for block-level storage for virtual machines. The hypervisor is connected to the SAN via gigabit on a dedicated VLAN and interfaces. We only have a single virtual machine set up and are doing some benchmarks. If we run hdparm -t /dev/sda1 from the virtual machine, we get 'ok' performance of 75MB/s from the virtual machine to the SAN. Then we basically compile a package with ./configure and make. Things start OK, but then all of a sudden the load average on the SAN grows to 7+ and things slow down to a crawl. When we SSH into the SAN and run top, sure enough the load is 7+, but CPU usage is basically nothing, and the server has 1.5GB of memory available. When we kill the compile on the virtual machine, the load on the SAN slowly goes back to sub-1 figures. What in the world is causing this? How can we diagnose this further? Here are two screenshots from the SAN during high load. 1> Output of iotop on the SAN: 2> Output of top on the SAN:

    Read the article

  • Open Source PDF reader for windows as an alternative to Adobe reader

    - by Tom Feiner
    With the latest JavaScript vulnerabilities in Adobe Reader and the bloat it has acquired over the years, I've been thinking of moving the network I'm in charge of to a different product for PDF reading on Windows. The ideal PDF reader should be something that is:

    - Small in size (Adobe Reader is more than 200MB these days after installation).
    - As secure by default as possible (for example, JavaScript disabled by default).
    - Nice looking, with an easy-to-use interface.
    - Not bloated with features (I just want to read PDFs, that's it).
    - Does not install any toolbars/unwanted add-ons/spyware.
    - Does not display any ads while viewing PDFs.
    - Preferably open source (this pretty much ensures no ads).
    - Full Unicode support.

    Ideally, something like Evince from GNOME would be the best option, but unfortunately that's not available on Windows. Foxit is an option, as it is small and has a nice interface. But it still has JavaScript enabled by default, which might lead to vulnerabilities; it installs a toolbar and displays ads while reading PDFs, which is distracting. There is a site dedicated to open source PDF readers, pdfreaders.org; however, the Windows PDF readers there each have their problems, mostly interfaces that are not as convenient (as Evince, Adobe or Foxit). Here's a list of all PDF software from Wikipedia. There's a "Viewers" section for each OS. What Windows PDF reader would you recommend?

    Read the article

  • Intel Core i7 QuadCore on HP Pavilion dv7 Overheating Issues

    - by kellax
    I bought a brand new HP notebook: an HP Pavilion dv7-6b21em BeatsAudio edition. The notebook is about 2 months old and has a pretty nasty overheating problem. I mainly use it for development; however, I do play some games. The disturbing thing is that the computer is loud on pretty simple tasks. Here are the specs:

    - CPU: Intel Core i7-2670QM quad-core (8 threads) @ 2.20 GHz
    - RAM: 8GB (2x 4GB @ 1066)
    - HDD: 1TB 7200 rpm
    - GPU: ATI Radeon HD 6770M, 1GB dedicated DDR
    - OS: Windows 7 64-bit Enterprise

    I have an external monitor running on the VGA port, a 22" Samsung SyncMaster S24B300.

    CPU heat statistics: Platform: rPGA 988B (Socket G2), Frequency: cca. 3000 MHz, VID: 1.1809 - 1.2059 V, Revision: D2, CPUID: 0x206A7, TDP: 45.0 W, Lithography: 32 nm. Heat: Tj. Max: 100°C, Power 4.5 - 5.9 W. Core #0: 63°C, Core #1: 65°C, Core #2: 66°C, Core #3: 67°C (load on all is about 0 to 2%).

    I opened the notebook; the fan is working fine and there is no dust, but right now the fan is pretty loud even though all I have open is Firefox. When I run a game the heat jumps to a whopping 90-97°C. It has not shut down due to overheating yet, but the loud fan is pretty annoying considering I'm not really doing anything stressful. Is there anything I can do to fix this? Is it maybe a BIOS issue? I have all drivers updated to the latest versions. I have very few background processes running, consuming barely 2GB of RAM and about 2% of the CPU. I had it serviced; they said there is nothing wrong with it. But I feel that a notebook that costs 1.2k euros can't be like this.

    Read the article

  • Does any Certificate Authority support both SAN and wildcards?

    - by nicholas a. evans
    My basic quandary is that wildcard certificates don't support subdomains of subdomains, nor do they help with alternate domain names. Basically, if my CN is example.com, I want a Subject Alternative Name field that looks roughly like so:

        DNS:example.com
        DNS:*.example.com
        DNS:*.beta.example.com
        DNS:example.net
        DNS:*.example.net
        DNS:*.beta.example.net

    Using a self-signed cert, I verified that the browsers will work just fine with this. Unfortunately, none of the Certificate Authorities that I looked into (Thawte, GoDaddy, Verisign, Digicert) seemed to support both wildcard certs and Subject Alternative Name (sometimes referred to as "Multiple Domain UCC"). I even called up GoDaddy tech support to confirm. Is there a CA (trusted by 99% of browsers) that supports wildcards for the Subject Alternative Name? One little restriction: I'm saddled with Amazon EC2's single-Elastic-IP-per-instance limitation. Here are what I see as my backup plans:

    - Set up three extra EC2 instances, each configured for a different IP address and cert, and nginx reverse proxy from three of them into the app server(s). Introduces latency(?), and even the cheapest EC2 instance isn't that cheap.
    - Instead of dedicated reverse proxy instances, set up four or more almost identical EC2 app servers, with nginx using the port to determine which cert to deliver, and use haproxy to distribute the traffic amongst themselves. Complicated to configure and manage? I'm not using the cheapest EC2 instance type for my app servers, so if I don't need 4+ app servers for the load, it raises the cost.
    - Set up an external server (outside of EC2) that doesn't have EC2's Elastic IP address restrictions, set up all of the alternate IP addresses and certificates on that server, and nginx reverse proxy from that server into the EC2 app servers. Extra IP addresses are almost free (you still need to pay for the server, of course), but they don't come with the robust "elasticity" that Amazon's Elastic IPs provide. Even more latency than in the first scenario.

    Are these approaches crazy or reasonable? Do you have another one to suggest?
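
    For reference, here is a rough sketch (using Python's cryptography package, assuming it is installed) of generating this kind of self-signed certificate: a plain CN plus a Subject Alternative Name list mixing apex, wildcard, and sub-subdomain wildcard entries. It only demonstrates that the certificate format allows the combination, which is what the browsers end up honoring; it says nothing about what any CA will actually issue.

        # Sketch: self-signed cert whose SAN mixes plain and wildcard entries.
        import datetime
        from cryptography import x509
        from cryptography.x509.oid import NameOID
        from cryptography.hazmat.primitives import hashes, serialization
        from cryptography.hazmat.primitives.asymmetric import rsa

        key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
        name = x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, "example.com")])
        san = x509.SubjectAlternativeName([
            x509.DNSName(d) for d in (
                "example.com", "*.example.com", "*.beta.example.com",
                "example.net", "*.example.net", "*.beta.example.net",
            )
        ])
        now = datetime.datetime.utcnow()
        cert = (
            x509.CertificateBuilder()
            .subject_name(name)
            .issuer_name(name)                 # self-signed, so issuer == subject
            .public_key(key.public_key())
            .serial_number(x509.random_serial_number())
            .not_valid_before(now)
            .not_valid_after(now + datetime.timedelta(days=365))
            .add_extension(san, critical=False)
            .sign(key, hashes.SHA256())
        )
        print(cert.public_bytes(serialization.Encoding.PEM).decode())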

    Read the article
