Search Results

Search found 19131 results on 766 pages for 'power options'.


  • Shared storage for web cluster

    - by user52475
    Hi all! I have a big question about choosing a shared/clustered/distributed file system for storage. It will be shared storage for shared web hosting (web files + maildir) and for OpenVZ container storage. Does anyone have a working example of such a system? The options are:

    - Lustre
    - GFS1/GFS2 (GFS2, as I understand it, is still experimental)
    - NFS

    These are the three systems I am considering for shared storage. Right now I have storage on hardware RAID 10 (1 TB). With NFS, as I understand it, there will be problems with locking? With GFS/Lustre, there are problems when there are a lot of small files (which is typical for a hosting environment) and problems with maildir.
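
    (To frame the NFS option concretely, a minimal sketch of a client mount, assuming a server at 10.0.0.5 exporting /srv/storage; the address and paths are hypothetical.)

        # Hypothetical /etc/fstab line on a web node; NFSv3 locking relies on
        # rpc.statd/lockd running on both client and server.
        10.0.0.5:/srv/storage  /var/www/shared  nfs  rw,hard,intr,noatime,vers=3  0 0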

    Read the article

  • Experimenting with other search engines

    - by Bill Graziano
    I’ve been a Google user so long I can hardly remember what I used before it. Alta Vista maybe? Or Yahoo. I’ve tried Bing off and on but it never really stuck. I probably care more about search engines than your average user because of their impact on SQLTeam.com. Lately I’ve been trying two other search engines and actually switched to one of them.

    I’ve played with Blekko a little in the past. They have some interesting ways to “slice up” your results. For example, searching on “SQL Server /blogs /date” should just search all the recently updated blogs. Those two extra words on the search are slashtags. The full list of slashtags runs from /forums to just see forums, to /twitter, to /nikon, to /reviews, and on and on. I laughed when I saw they had slashtags for both liberal and conservative. I’d hate to find any search results that don’t match my existing worldview :) You can also create your own slashtags. I created a mini-search engine for the SQL Server blogs that I read. You can search it for “backup” at http://blekko.com/ws/backup+/billgraziano/sql-sites. I uploaded my OPML and it limited the search to just those sites. It seems like the site is focusing more on curating results and less on algorithms. This is an interesting site for power searchers. There are some great ways to curate results using slashtags. For 99% of my searches (type words, click on one of the first few links) slashtags are overkill. They do have some good information on page and site ranking though, so I’ll probably spend some time looking through that.

    Blekko recently got my attention again when they said they were banning “content farms” - and that includes eHow and experts-exchange. I always feel used when I click on a link to EE and find myself scrolling all the way to the bottom to see if I can find the answer. Sometimes it’s there, but sometimes it tells me I need to pay first. I’ve longed for a way to always exclude certain sites. Blekko might be taking a hammer to a problem that needs a scalpel, but it’s an interesting choice. (And some of the comments in the TechCrunch link are interesting if you’re a search nerd.)

    DuckDuckGo is an odd name for a search engine. Their big hook is that they don’t keep search history. If you wade through your Google account you can probably find the page where it stores your search history. It was pretty enlightening to find mine. It was easy to disable, but that got me started looking at other search engines. DDG (or DukGo) just feels like Google used to in the old days. The results are good enough and the site is fast. Searches will return a snippet from Wikipedia or another site (like StackOverflow) at the top. I think the idea is to answer the question without needing to visit the site. I’m not sure that’s a good thing for SQLTeam.com.

    The only thing I really miss is image search. You can add a “!i” at the end of any search and it will search the images on Bing. Bing doesn’t have a great image search but it works for most of what I need. They call these exclamation marks “!bangs” and they’re kinda, sorta like slashtags. I’ve been using DuckDuckGo for a few weeks now and I’m pretty happy with it. I use Chrome for my browser and it was an easy switch to make. It’s still a little surprising seeing my search results come up in a different format, but I’m starting to get used to it.

    Read the article

  • Datacenter Backup Strategy

    - by EasyEcho
    What are common approaches to backup solutions in remote data centers? I am already familiar with general backup principles and have a very good backup strategy for our local data center, but am having great difficulty extending it to a remote data center. We currently do a full backup on Friday, differentials Mon - Thu, and rotate offsite Friday morning... rinse and repeat week after week. BTW, we use disks and have been very happy with this approach. We could buy a large storage server and back everything up to it, but this solution doesn't give us offsite protection. We could encrypt and upload to Amazon or some other online storage, but that would take a large amount of time given the data volume and would be rather expensive, paying for the bandwidth leaving the data center and arriving at Amazon. We could drive to the data center every Friday and continue to rotate disks as we do now, but that just seems old-fashioned. What am I missing? Are there better options?

    Read the article

  • Free hosting solution for a very low-traffic website [duplicate]

    - by user966939
    This question already has an answer here: How to find web hosting that meets my requirements? (4 answers)

    I run a very low-traffic website (about 40 users, basically all of whom are active on the site daily). I don't see that changing anytime soon either, as there is no way to sign up on the site right now. Until now I have just been using a sub-directory on a friend's (shared) host to host the web site. But in only a few weeks from now his subscription will end, and he has no plans on renewing it. So of course this means I'll have to move on to something else. But I don't think I'll find someone who'd be willing to share a... shared host with me again. And besides, the software used on that server is ancient (PHP 4.4.9 + MySQL 4.1.22).

    There's one obvious solution that comes to mind, I guess: choose a better host and pay for it myself. The problem here is that I have no real fixed income, as I'm only a student. So even if the pricing is dirt cheap, I just can't be certain I will be able to afford it, every single month, for... at least 2 years maybe?

    So I've looked at free hosting solutions instead. The minimum requirement I had was that it be completely free of ads. But no matter where I look, I always find something in a corner or two ("what can you expect from a free host?" - yeah, I know, but I guess it was worth a shot). For example, on Byethost (one of the free hosts I tried), if you trigger a PHP error while error reporting is set to E_ALL, you will spawn some hidden ad... Besides Byethost, I've tried 000Webhost, x10Hosting, 2Freehosting/1Freehosting, Wink.ws, and they are only worse.

    Okay, I'm running low on ideas. But! What if I just hosted the site myself, on my own computer? That could work. I actually do have my computer on practically 24/7. But not really. Sometimes I need to reboot it, and sometimes we even have power outages. And what if the hardware needs an upgrade? It's not such a big deal for me if the site goes down, because I know what's going on; but what about the users? If I do decide to host it myself, is there some way to show users an alternate page instead of them just seeing a generic "server not found" page in the browser when the site is not accessible? Or is there something I have been missing out on? Is there a different kind of "web hosting" solution out there that I haven't heard of?

    Here is what I'm really looking for:

    - Free (as in, no costs)
    - NO ads
    - Bandwidth enough for a low-traffic forum with roughly 40 users
    - (Semi-)up-to-date PHP and MySQL (at least not older than a year)
    - No standard (non-extension) PHP functions turned off, such as sleep()
    - The mbstring extension enabled
    - Disk space: at least 5 MB
    - At least one MySQL database

    Some bonus points would be:

    - Max execution time of PHP scripts can be set
    - Remote access to the MySQL database

    What would be the best solution for me? Is there one?

    Read the article

  • How do I mount a CIFS share via FSTAB and give full RW to Guest

    - by Kendor
    I want to create a Public folder that has full RW access. The problem with my configuration is that Windows users have no issues as guests (they can read, write, and delete), but my Ubuntu client can't do the same. We can only write and read, but not create or delete. Here is the smb.conf from my server:

        [global]
        workgroup = WORKGROUP
        netbios name = FILESERVER
        server string = TurnKey FileServer
        os level = 20
        security = user
        map to guest = Bad Password
        passdb backend = tdbsam
        null passwords = yes
        admin users = root
        encrypt passwords = true
        obey pam restrictions = yes
        pam password change = yes
        unix password sync = yes
        passwd program = /usr/bin/passwd %u
        passwd chat = *Enter\snew\s*\spassword:* %n\n *Retype\snew\s*\spassword:* %n\n *password\supdated\ssuccessfully* .
        add user script = /usr/sbin/useradd -m '%u' -g users -G users
        delete user script = /usr/sbin/userdel -r '%u'
        add group script = /usr/sbin/groupadd '%g'
        delete group script = /usr/sbin/groupdel '%g'
        add user to group script = /usr/sbin/usermod -G '%g' '%u'
        guest account = nobody
        syslog = 0
        log file = /var/log/samba/samba.log
        max log size = 1000
        wins support = yes
        dns proxy = no
        socket options = TCP_NODELAY
        panic action = /usr/share/samba/panic-action %d

        [homes]
        comment = Home Directory
        browseable = no
        read only = no
        valid users = %S

        [storage]
        create mask = 0777
        directory mask = 0777
        browseable = yes
        comment = Public Share
        writeable = yes
        public = yes
        path = /srv/storage

    The following fstab entry doesn't yield full R/W access to the share:

        //192.168.0.5/storage /media/myname/TK-Public/ cifs rw 0 0

    This doesn't work either:

        //192.168.0.5/storage /media/myname/TK-Public/ cifs rw,guest,iocharset=utf8,file_mode=0777,dir_mode=0777,noperm 0 0

    Using the following location in Nemo/Nautilus without the share being mounted does work:

        smb://192.168.0.5/storage/

    Extra info: I just noticed that if I copy a file to the share after mounting, my Ubuntu client immediately makes "nobody" the owner, and the group "nogroup" gets read and write, with everyone else read-only. What am I doing wrong?
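
    (A sketch of an fstab variant that may be worth trying: mount.cifs maps ownership to whatever uid= and gid= are set to, so pointing them at the desktop user often gives that user full access. The uid/gid of 1000 is an assumption; substitute the output of the id command.)

        # Hypothetical /etc/fstab line: map files on the guest mount to the local user (uid/gid 1000).
        //192.168.0.5/storage /media/myname/TK-Public cifs guest,uid=1000,gid=1000,file_mode=0777,dir_mode=0777,iocharset=utf8 0 0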

    Read the article

  • SharePoint MOSS - Serve HTTP content on an HTTPS page without Mixed Content Warning?

    - by kcb263
    Our "portal-like" SharePoint site is served using HTTPS/SSL. So a user goes to https://web.company.com and sees content and different Web Parts. So far, no problem. The desire now is to have new Web Parts added that either frame HTTP content (such as Weather Bug) or HTTP RSS feeds. The issue that arises is that by doing this, results in a "Mixed Content" warning in the browser. Has anybody successfully been able to implement such a scenario, or one similar to it? The options we have looked at, unsuccessfully, have been: using Apache Reverse Proxy Server mirror an external site Custom Web Parts

    Read the article

  • Oracle Open World / Public Sector / Identity Platform

    - by user12604761
    For those attending Oracle Open World (Oct. 1st - 3rd, 2012, at the Moscone Center in San Francisco): OOW has a focus on the Public Sector, and Oracle's foundational Identity and Access Management and Database Security products, which support government ICAM security solutions, are covered extensively during the event. The focus is on Oracle's Modern Identity Management Platform:

    - Integrated Identity Governance
    - Mobile Access Management
    - Complete Access Management
    - Low Risk Upgrades

    The options for attendees include 18 sessions on Identity and Access Management, 9 Identity and Access Management demonstration topics at the Identity Management Demo Grounds, and 2 hands-on labs, as well as 21 database security sessions.

    Oracle Public Sector Reception at OOW: Join Oracle's Public Sector team on Monday, October 1 for a night of food and sports in a casual setting at Jillian's, adjacent to Moscone Center on Fourth Street. In addition to meeting the Public Sector team, you can enjoy Monday Night Football on several big-screen TVs in a fun sports atmosphere.

    When: Monday, October 1, 6:30 p.m.–9:30 p.m.
    Where: Jillian's, 101 Fourth Street, San Francisco

    Read the article

  • How to improve my backup strategy (rsync)?

    - by GUI Junkie
    I've seen the Q&As about backup solutions, but I'm asking anyway: one, because it's a personal situation I haven't solved yet, and two, because the answer can be useful for others. My situation is rather simple. I have two computers with two users and one external hard drive, and I want to sync/backup a shared directory. Currently I use rsync with the -azvu options to sync to the external drive. My problem is the round trip: all deleted files are restored! Using rsync I'm doing:

        Computer A --> External disk --> Computer A
        Computer B --> External disk --> Computer B

    (I should probably do External disk --> Computer A as a last step.) I've seen 'bup' mentioned, and other Q&As talk about dropbox + rsync... Another option is maybe to have rsync delete files? Can my current backup strategy be improved in some other way?
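
    (One detail worth illustrating: rsync only propagates deletions when told to, so a round trip without --delete resurrects files. A minimal sketch, assuming the shared directory is ~/shared and the drive mounts at /media/external; both paths are hypothetical.)

        # Push local state to the drive, removing files that were deleted locally.
        rsync -azvu --delete ~/shared/ /media/external/shared/

        # Pull the merged state back; --delete here would also remove local files
        # deleted on the other computer, so run it only after both machines have pushed.
        rsync -azvu --delete /media/external/shared/ ~/shared/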

    Read the article

  • When to use delaycompress option in logrotate?

    - by Anand Chitipothu
    The man page of logrotate says that:

        It can be used when some program cannot be told to close its logfile and thus might continue writing to the previous log file for some time.

    I'm confused by this. If a program cannot be told to close its logfile, it will continue to write to it forever, not just for some time. And if the compression is postponed to the next rotation cycle, the program continues to write to that file even after the next rotation cycle. How does postponing solve the problem? My understanding is that copytruncate should be used when a program cannot be told to close its logfile. I'm aware that some data written to the logfile gets lost while the copy is in progress. I was looking at the logrotate file for couchdb, and it had both the copytruncate and delaycompress options:

        /usr/local/couchdb-1.0.1/var/log/couchdb/*.log {
            weekly
            rotate 10
            copytruncate
            delaycompress
            compress
            notifempty
            missingok
        }

    It looks like there is no point using delaycompress when copytruncate is already there. What am I missing?

    Read the article

  • Ubuntu 12.04 slow boot on ASUS, attached with dmesg and bootchart

    - by stanleyhunk
    I heard that Ubuntu can boot in around 30 seconds, but mine takes more than 60 seconds every time it boots. I also read in some forums that I should post the dmesg output and a bootchart to identify which process is slowing down the boot. As I'm not an expert in Ubuntu and wish to learn more about it, I humbly ask any pro here to teach me how. My laptop specs:

        Model: ASUS K45VS
        RAM: 8 GB
        CPU: Intel(R) Core(TM) i7-3630QM CPU @ 2.40GHz x 8
        Graphics card: nVidia GeForce GT 645M
        HDD: 750 GB
        OS: single-boot Ubuntu 12.04 LTS
        System.uname: Linux 3.8.0-39-generic #58~precise1-Ubuntu SMP Fri May 2 21:33:40 UTC 2014 x86_64
        System.release: Ubuntu 12.04.4 LTS
        System.kernel.options: BOOT_IMAGE=/boot/vmlinuz-3.8.0-39-generic root=UUID=c8a71503-bce8-406c-9a5f-5aa8284f5c7c ro quiet splash

    My dmesg (note the huge time gap between the entries):

        [ 30.772656] cgroup: libvirtd (1961) created nested cgroup for controller "memory" which has incomplete hierarchy support. Nested cgroups may change behavior in the future.
        [ 30.772659] cgroup: "memory" requires setting use_hierarchy to 1 on the root.
        [ 30.772683] cgroup: libvirtd (1961) created nested cgroup for controller "devices" which has incomplete hierarchy support. Nested cgroups may change behavior in the future.
        [ 30.772710] cgroup: libvirtd (1961) created nested cgroup for controller "blkio" which has incomplete hierarchy support. Nested cgroups may change behavior in the future.
        [ 32.140335] nvidia 0000:01:00.0: irq 46 for MSI/MSI-X
        [ 32.505619] ACPI Error: Field [TMPB] at 1081344 exceeds Buffer [ROM1] size 262144 (bits) (20121018/dsopcode-236)
        [ 32.505624] ACPI Error: Method parse/execution failed [\_SB_.PCI0.PEG0.PEGP._ROM] (Node ffff880224e91f00), AE_AML_BUFFER_LIMIT (20121018/psparse-537)
        [ 802.034422] audit_printk_skb: 69 callbacks suppressed
        [ 802.034428] type=1400 audit(1400914804.392:35): apparmor="DENIED" operation="capable" parent=1 profile="/usr/sbin/cupsd" pid=1683 comm="cupsd" pid=1683 comm="cupsd" capability=36 capname="block_suspend"
        [ 1581.300901] type=1400 audit(1400915584.816:36): apparmor="DENIED" operation="capable" parent=1 profile="/usr/sbin/cupsd" pid=1683 comm="cupsd" pid=1683 comm="cupsd" capability=36 capname="block_suspend"

    My bootchart.png is attached. Looking forward to learning how to improve both my boot time and my knowledge. Thanks in advance :)
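
    (For anyone wanting to capture the same data, a sketch of the usual commands on 12.04; the package names are my assumption of the stock Ubuntu ones.)

        # Save the kernel boot log with timestamps for posting.
        dmesg > ~/boot-dmesg.txt

        # Install the boot profiler; it records automatically on the next boot
        # and leaves its chart under /var/log/bootchart/.
        sudo apt-get install bootchart pybootchartgui
        sudo reboot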

    Read the article

  • Exchange unable to relay mail outbound

    - by Saif Khan
    I have some users configured in Exchange 2003 (delivery options tab) to forward mail to their external addresses. This was working fine until today; now the mails are being held up in the SMTP queue folder. I am able to telnet to the destination hosts (e.g. google.com) on port 25 from the server. Any reason why the mails are held up? Other emails are going out; it's only the mailboxes configured to forward mail to public email addresses. I also did the following:

    - Checked event logs for errors. Nothing.
    - Checked my domain on blacklists. Nothing.

    Any ideas?
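
    (A sketch of a manual SMTP test from the Exchange server; the remote host and addresses are placeholders. If the RCPT TO line comes back with a 5xx response, the text of that response usually names the reason the forwards are being refused.)

        telnet mx.example.com 25
        HELO mydomain.com
        MAIL FROM:<user@mydomain.com>
        RCPT TO:<forwarded.address@example.com>
        QUIT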

    Read the article

  • Grandma's Computer - Can a user that belongs only to the "Users" group in Windows XP install malware, virus or IE addons?

    - by DanC
    I am trying to figure out if having a user in the "Users" group will be enough to prevent her from installing unwanted software. The things that I don't want the user to be able to install are:

    - viruses
    - malware
    - Bandoo-type stuff
    - Internet Explorer add-ons

    To put you in context, I am thinking of my grandma's computer. I want her to be able to read all her email and attachments, but without the hassle of needing to reinstall the whole computer every few months. The computer will run Windows XP with some free antivirus. It will not be part of any domain; it is just a home computer. Linux: I have tried making her use it, but she was already accustomed to Windows, and it was not really an option to have her re-learn where the shutdown button is. So, are these considerations enough to prevent her from installing unwanted software? What other options come to mind? Thanks

    Read the article

  • Terrible Performance with SATA Drives on Dell PowerEdge, steps to troubleshoot?

    - by Tom
    I asked this question earlier and it went missing, so here it is again. We bought a Dell PowerEdge 2950 to use as an in-house QA server. Disk performance is beyond terrible: 1000-4000 ms response times on the drive holding our SQL Server database .mdf, and the SQL Server disk queue is upwards of 300 at times. I'm a software guy; can anyone help me with steps to determine the issue? I don't know what RAID controller it has; how can I determine that? I'm speculating it could be a BIOS issue. Perhaps the server used to have another kind of drive in it, and when I added SATA the ??? buffer size is wrong??? Perhaps I chose the wrong options (I chose the defaults) when setting up the RAID 1 arrays? I thought RAID 1 was a performance array?

    Read the article

  • VisualSVN Server , Windows 7 and Apache Problem

    - by Ash
    I am running VisualSVN Server (with Apache) on a Windows 7 computer and network. About 15-20 minutes after my first commit/update, I am unable to access the repository via TortoiseSVN. The error message I get is:

        OPTIONS of "https://jason/svn/repository1": could not connect to server (https://jason)

    Restarting the VisualSVN Server service helps sometimes but fails quite often. The only sure-shot way to get it working is to restart the computer. The server (https://jason) is also not accessible via the browser when I get this error.

    1) I tried reinstalling Windows 7, VisualSVN Server, and TortoiseSVN, but I still keep getting this error.
    2) I searched several forums but I don't seem to be able to find an answer.

    Please help.

    Read the article

  • rsync to cifs mount but preserve permissions

    - by weberwithoneb
    I'm backing up a Linux server to a Windows share. I'm currently mounting the Windows share with cifs and using rsync for incremental backups. File permissions and ownership are not being preserved, as should be expected after reading this Samba document:

        The core CIFS protocol does not provide unix ownership information or mode for files and directories. Because of this, files and directories will generally appear to be owned by whatever values the uid= or gid= options are set, and will have permissions set to the default file_mode and dir_mode for the mount.

    How can I achieve my goal of preserving unix file permissions while writing to a Windows share? Is there another network file system that would allow me to do this? Thanks.
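
    (One mechanism that can bridge this gap, sketched under the assumption that the cifs mount supports user extended attributes: rsync's --fake-super saves ownership and mode into xattrs on the destination instead of applying them. The paths are hypothetical.)

        # Stash owner/group/mode in extended attributes on the CIFS destination.
        rsync -a --fake-super /srv/data/ /mnt/winshare/backup/

        # A restore run with the same flag reapplies the stored attributes.
        rsync -a --fake-super /mnt/winshare/backup/ /srv/data/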

    Read the article

  • What free OS should I use on my VPS?

    - by earlz
    Hello, I looked around a bit but didn't see any duplicate of this, so my question is: which free (open source) OS do you use on servers, and why? Background: I have a VPS at Linode. There is a broad range of options for which OS I can put on it, including both 32- and 64-bit OSs. I just use it to run my small blog and for hosting random files; it's very low traffic. I have been using 64-bit Arch Linux on my VPS, and though I love the OS for general usage, the constant breakage is troublesome on a server. So I'm considering trying something new and am looking for suggestions.

    Read the article

  • Solving &ldquo;XmlSchemaException: The global element '&lt;elementName&gt;' has already been declare

    - by ChrisD
    I recently encountered this error when I attempted to consume a newly hosted WCF service. The service used the request/response model and had been properly decorated. The response and request objects were marked as DataContracts and had a specified namespace. My WCF service interface was marked as a ServiceContract and shared the namespace attribute value. Everything should have been fine, right?

        [ServiceContract(Namespace = "http://schemas.myclient.com/09/12")]
        public interface IProductActivationService
        {
            [OperationContract]
            ActivateSoftwareResponse ActivateSoftware(ActivateSoftwareRequest request);
        }

    Well, not exactly. Apparently the WSDL generator was having an issue:

        System.Xml.Schema.XmlSchemaException: The global element 'http://schemas.myclient.com/09/12:ActivateSoftwareResponse' has already been declared.

    After digging I found the problem: the WSDL generator has some reserved suffixes for its entities, including Response, Request, and Solicit (see http://msdn.microsoft.com/en-us/library/ms731045.aspx). The error message is actually the result of a naming conflict. The WSDL generator uses the namespace of the service to build its reserved types. The service contract and data contract shared a namespace, which, coupled with the Response/Request suffixes I was using in my class names, resulted in the SchemaException.

    The fix, two options:

    1. Rename my data contract entities to use a non-reserved suffix (i.e., change ActivateSoftwareResponse to ActivateSoftwareResp); or
    2. Change the namespace of the data contracts to differ from the service contract namespace.

    I chose option 2 and changed all my data contracts to use a "http://schemas.myclient.com/09/12/data" namespace value. This avoided the name collision and I was able to produce my WSDL and consume my service.

    Read the article

  • Ubuntu 12.04 host lookups extremely slow

    - by tubaguy50035
    I'm having issues with one of my servers taking a long time to look up host names. This is an Ubuntu 12.04 box, so I've tried following the new resolvconf directives. In my /etc/network/interfaces file, I defined my name servers like this:

        auto eth0
        iface eth0 inet static
            address someaddress
            netmask 255.255.255.0
            gateway 198.58.103.1
            dns-nameservers 74.14.179.5 72.14.188.5

    In my /etc/resolv.conf, I see these name servers, like this:

        # Dynamic resolv.conf(5) file for glibc resolver(3) generated by resolvconf(8)
        # DO NOT EDIT THIS FILE BY HAND -- YOUR CHANGES WILL BE OVERWRITTEN
        nameserver 74.14.179.5
        nameserver 72.14.188.5

    On another box, I edited resolv.conf directly, as directed by my host's setup help files. It looks like this:

        domain members.linode.com
        search members.linode.com
        nameserver 72.14.179.5
        nameserver 72.14.188.5
        options rotate

    This second box has no issues with host name lookups and responds quite quickly. Could not having the domain and search directives make my lookups slow? By slow, I mean it's taking anywhere from 5 to 15 seconds to find the IP address of a host.
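
    (A sketch of how one might time each resolver directly and take the search path out of the picture; the domain queried is just an example, and the addresses are the ones from the post.)

        # Time a lookup against each configured nameserver individually.
        time dig +short example.com @74.14.179.5
        time dig +short example.com @72.14.188.5

        # A trailing dot makes the name fully qualified, skipping the search/domain suffixes.
        time getent hosts example.com.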

    Read the article

  • How can I connect my USB (HP) printer in 10.4, which isn't discovered but worked in 9.x?

    - by Brian
    My printer was working under 9.x. It is an HP Photosmart C3100 series. When I open Administration -> Printing, no local printers are found. I try to add one via "Other" (my local choices are Serial and Other). I have tried many URIs: ipp://localhost:631/ipp, http://localhost/ipp, localhost, 127.0.0.1, etc. None have worked. Under "Networked" I have tried JetDirect, using localhost and 127.0.0.1 and port 631. I have tried many options under IPP with different variants in the host, trying to verify a printer. No luck. I tried LPD/LPR with localhost and tried the probe; no luck. I tried the CUPS admin via localhost:631 and that didn't work. On the old version it simply found the local printer. I might have picked the driver, I can't remember, but it was the Photosmart C3100 series that was working. I just can't get 10.4 to print.
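
    (A sketch of the usual HPLIP route on Ubuntu, which probes the USB bus directly instead of relying on the printing dialog; hplip and hp-setup are the standard HPLIP package and tool.)

        # Check that the printer is visible on the USB bus at all.
        lsusb | grep -i hewlett

        # Install HP's driver stack and run its interactive USB setup.
        sudo apt-get install hplip
        hp-setup -i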

    Read the article

  • Why does Unity not extend to my 2nd monitor, even when it is displaying an X-Screen?

    - by Gridwalker
    I recently added a 2nd video card to my system, but Unity refuses to extend my desktop over to the second screen. Although the secondary monitor initialises when I boot and I can move the mouse cursor over to the 2nd screen, the screen is otherwise blank (showing no wallpaper or interface elements) and I am unable to move any windows to this monitor. Moving the mouse cursor over to the 2nd monitor changes it from the default cursor to the old-style X cursor, such as the one that appears when you run xkill, indicating that this screen is initialised in the X server but that Unity is not recognising it.

    Although the Nvidia X Server Settings application can see both monitors, the Unity system settings application does not detect the 2nd adapter. Sometimes the additional drivers application can see both adapters, but it doesn't consistently show options for them both. xrandr also fails to detect the 2nd monitor, but iNex lists both adapters.

    I have experimented with several different drivers for each adapter and with setting each of the graphics cards as the primary adapter in the BIOS, but this has made little difference. The two adapters are an onboard GeForce 8200 and a PCIe GeForce 7200 GX. The onboard adapter is currently set as the primary; however, this adapter crashes whenever I use the Nouveau driver, and I have to switch over to the PCIe card as primary whenever I purge the proprietary drivers (switching back when the 304 driver has been reinstalled). It doesn't matter which adapter I set as my primary, the results are the same: one screen showing the Unity interface and one screen showing an X screen that only displays the mouse cursor.

    All I want is to be able to run this system in a dual-screen configuration. I am not a gamer, nor do I require 3D rendering capabilities. Anything you can suggest to get the desktop to extend across both screens will be massively appreciated!

    Read the article

  • How to dual-boot 32-bit/64-bit Windows 7 Ultimate

    - by Cyril Horad
    I have a problem with my nVidia driver not running on 64-bit. I decided to install both 32-bit and 64-bit Windows 7 on my ASUS K42JC (with a 4GB RAM upgrade) so that the nVidia card can work on the 32-bit install. My question is: how could I make my laptop run on either the 32-bit or the 64-bit OS? What options am I supposed to use: a single, double, or triple partition?

    From an answer: Well, when I installed the nVidia driver, from either the ASUS site or the driver prescribed by the NVIDIA site via System Requirements Lab, both ended up freezing my laptop at the point when the desktop is about to finish booting. I have tried reformatting and trying to fix the problem three (3) times, to no avail. I filed a ticket with ASUS support, but no replies yet. What bothers me is why the nVidia driver won't run on 64-bit, yet it runs perfectly on 32-bit.

    Read the article

  • Looking for a router-like web interface for my Debian gateway.

    - by marcusw
    Hey, I need a web interface program for my Debian gateway which has the features of a router's interface. Specifically, I must be able to easily:

    - Forward ports to various clients on the LAN or to the router itself (it's also a server)
    - Manage a DHCP server, preferably including DHCP reservations for certain MACs
    - Get a list of the connected DHCP clients
    - (Optionally) see which clients use the most bandwidth (something like iftop)

    Alternatively, it could be a graphical app which I could tunnel over ssh. No command-line programs please... I'm used to doing this stuff with a point-and-click interface. I'm not averse to command-line setup; I just need to be able to reconfigure things graphically. I have a working LAMP setup. I've tried webmin, but it didn't satisfy the "easy" part... too many clicks and too many menu options.

    Read the article

  • Windows XP Blue-Screens And Reboots Instantly On Boot-Up

    - by ashes999
    My computer was running perfectly until yesterday, when my internet suddenly died. The connection being wireless, I decided to reboot; now the computer doesn't boot. It shows the "Starting Windows" screen (safe mode loads several drivers), then it suddenly blue-screens and immediately reboots (I cannot read the blue-screen contents at all). I've tried safe mode (with and without networking); no dice. I booted another HD with another XP install and ran scandisk on my HDs: no bad sectors, and even with auto-repair it didn't fix the problem. Now what? What are my options to fix the broken XP install? I'm thinking:

    1) Remove the wireless card
    2) Run the XP install CD and repair the installation
    3) ???

    Read the article

  • How to properly install Chromium from a zip and make it the default browser

    - by ClarifyLinux
    Since the Chromium PPA is no longer maintained, those of us preferring Chromium over Chrome have two options:

    1. Build and install from source
    2. Download either 'beta' or daily builds (in a zip file)

    Unfortunately for me, option 1 is overly complicated. I know how to compile most other applications on Ubuntu, but I've never been able to get Chromium to build correctly. So I am currently using option 2. In Chromium I have the Chromium Updater installed (http://goo.gl/ffAMy). This gives me quick access to the most recent 64-bit versions. Once downloaded, I install to /home/myuser/opt/chrome-linux. From this directory I can run the chrome binary. It works perfectly, except that I cannot get it to act as my default browser. I've tried, as root, installing the binary in /opt/chrome-linux/ with a symbolic link to the 'chrome' binary in /usr/bin. Unfortunately, this doesn't work as a non-root user. So my question is: how do I properly install a downloaded Chromium zip build so that it's listed as an option for the default browser?
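
    (A sketch of one way this is often wired up: register the binary with the alternatives system, and give the desktop a launcher entry to point at. The paths follow the post; the file name chromium-zip.desktop and the priority 200 are made up for the example.)

        # Register the unpacked binary as an x-www-browser alternative.
        sudo update-alternatives --install /usr/bin/x-www-browser x-www-browser \
            /home/myuser/opt/chrome-linux/chrome 200
        sudo update-alternatives --config x-www-browser

        # Contents for ~/.local/share/applications/chromium-zip.desktop (hypothetical name):
        # [Desktop Entry]
        # Type=Application
        # Name=Chromium (zip build)
        # Exec=/home/myuser/opt/chrome-linux/chrome %U
        # MimeType=text/html;x-scheme-handler/http;x-scheme-handler/https;
        # Categories=Network;WebBrowser;

        # With the .desktop file in place, point the desktop at it.
        xdg-settings set default-web-browser chromium-zip.desktop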

    Read the article

  • Handling Types for Real and Complex Matrices in a BLAS Wrapper

    - by mga
    I come from a C background and I'm now learning OOP with C++. As an exercise (so please don't just say "this already exists"), I want to implement a wrapper for BLAS that will let the user write matrix algebra in an intuitive way (e.g. similar to MATLAB):

        A = B*C*D.Inverse() + E.Transpose();

    My problem is how to go about dealing with real (R) and complex (C) matrices, because of C++'s "curse" of letting you do the same thing in N different ways. I do have a clear idea of what it should look like to the user: s/he should be able to define the two separately, but operations would return a type depending on the types of the operands (R*R = R, C*C = C, R*C = C*R = C). Additionally, R can be cast into C and vice versa (just by setting the imaginary parts to 0). I have considered the following options:

    1. As a real number is a special case of a complex number, inherit CMatrix from RMatrix. I quickly dismissed this, as the two would have to return different types for the same getter function.
    2. Inherit RMatrix and CMatrix from Matrix. However, I can't really think of any common code that would go into Matrix (because of the different return types).
    3. Templates. Declare Matrix<T>, declare the getter function as T Get(int i, int j), and operator functions as Matrix operator*(Matrix RHS). Then specialize Matrix<double> and Matrix<complex>, and overload the functions as needed.

    Then I couldn't really see what I would gain with templates, so why not just define RMatrix and CMatrix separately from each other, and then overload functions as necessary? Although this last option makes sense to me, there's an annoying voice inside my head saying this is not elegant, because the two are clearly related. Perhaps I'm missing an appropriate design pattern? So I guess what I'm looking for is either absolution for doing this, or advice on how to do better.

    Read the article
