Search Results

Search found 34971 results on 1399 pages for 'st even'.


  • Installing Silverstripe on 000webhost.com (free web host)

    - by benwad
    Hi, I'm trying to learn how SilverStripe works, so I extracted the tar file to my free hosting account. I then went to install.php and edited the permissions to meet the requirements it sets out, but I still get two warnings from the 'webserver configuration' section:

    I can't tell what webserver you are running. Without Apache I can't tell if mod_rewrite is enabled.
    I can't tell whether mod_rewrite is running. You may need to configure a rewriting rule yourself.

    I looked in phpinfo() and mod_rewrite appears to be installed. I contacted the web host and they said it was to do with virtual directory paths, and that I should add 'RewriteBase /' to the top of my .htaccess file in the public_html directory. I did this, but I still have the same problem. The install.php script says that I can install even with these warnings, but when I press 'install' it just refreshes the install.php page. It doesn't even overwrite the .htaccess file. 000webhost.com says they have successfully installed SilverStripe on their user accounts without much configuration, but I can't find out how.

    EDIT: I managed to get to the next page, but now there is another warning which is stopping the install:

    Friendly URLs are not working. This is most likely because mod_rewrite isn't configured correctly on your site. Please check the following things in your Apache configuration; you may need to get your web host or server administrator to do this for you:
    * mod_rewrite is enabled
    * AllowOverride All is set for your directory

    I also get this error message from the server:

    Warning: unlink(mysite/_config.php) [function.unlink]: Permission denied in /home/a2716553/public_html/install.php on line 701
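
    For reference, a minimal .htaccess that just checks whether the host honours mod_rewrite and AllowOverride at all (a diagnostic sketch; the rule name is made up):

        RewriteEngine On
        RewriteBase /
        # if /rewrite-test serves the installer page, mod_rewrite and AllowOverride work
        RewriteRule ^rewrite-test$ install.php [L]

    If http://yoursite/rewrite-test returns a 404 instead of the installer, the host is ignoring .htaccess entirely and no SilverStripe-side change will help. The unlink() warning also suggests the mysite folder itself still isn't writable by the web server user.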

    Read the article

  • ASUS M51SE freezes for no apparent reason

    - by Piotr Justyna
    First of all, it's my first question on Super User, so please excuse me if it doesn't belong here, if a similar one has already been posted, or if I missed some details.

    My ASUS M51 freezes up. It all started a couple of months ago, and I basically forgot about it since I bought a new laptop around that time. It has been bugging me ever since, though, and I can't explain why it's happening. Let me quickly describe what's wrong.

    When switched on and running (Win 7), it freezes after a couple of minutes of normal usage (or even if I don't actually do anything). By 'freezes' I mean it's as if a static image of my desktop were being displayed on the screen. Nothing happens, Ctrl+Alt+Del doesn't help; I basically have to switch it off using the power button.

    I tried to remove the hard drive and start the laptop without it. The same thing happens - it freezes on the initial black loading screen (a couple of minutes after the computer says it can't find the HDD). I tried removing the RAM - the same thing. All fans are spinning as they should. I cleaned the fans with a small paintbrush, but it doesn't change anything. The laptop is generally clean and in pretty good physical shape. Well, almost, obviously :).

    One possible clue I can think of is that the laptop heats up excessively even when it isn't actually doing anything (HDD removed). Do you have any ideas what the cause of this is, or what else I can try? Thanks, Piotr

    Read the article

  • Hyper-V 2008 R2 synthetic networking stops working with linux 2.6.32.15

    - by luxifer
    Hi there, I thought I'd give Hyper-V on Windows Server 2008 R2 Enterprise a try on my home server (yes, it's legit... got it from MSDNAA). The first thing to throw at it was my firewall, which runs IPFire. This distribution currently uses kernel version 2.6.32.15 and comes with the Hyper-V drivers, so I enabled them.

    At first they work just fine, but after a few minutes they simply fail: no packets go in or out anymore until I reboot the VM, and sometimes even that won't work, so the VM just keeps "Stopping" forever. Emulated networking works fine, but it's slow and uses more CPU. That way my firewall routes slower than when running under VirtualBox on an Atom N270. My server has an E6750; the VM is limited to 25%, but that should still outperform the Atom, especially since it never goes anywhere near 100% CPU load, so give me a break!

    A quick Google search led me to people having the same problem (even with other distributions and kernel versions that include those drivers) but no solution yet... I already found this, but I can't quite follow the author on the part where he solved the issue - especially since I need two virtual NICs for my firewall distro to work (obviously one internal and one external). What am I missing here?

    Read the article

  • IIS7 error config and remote errors

    - by Kev
    Certain IIS7/7.5 500.19 configuration errors only render on a browser running on the local server. This appears to happen regardless of whether I set <httpErrors errorMode="Detailed" existingResponse="PassThrough" /> in the system.webServer section of a site's web.config file (or even globally, for that matter).

    For example, I had a developer who reported that he was just getting the generic IIS7 500 error page. This was happening even though he had the following configured in his web.config:

        <configuration>
            <system.webServer>
                <httpErrors errorMode="Detailed" existingResponse="PassThrough" />
            </system.webServer>
        </configuration>

    If I browse to the site on the server itself, I see the detailed error (some sensitive info redacted). Could the reason for this be that a web.config containing errors can't be parsed, and because it can't be parsed the local <httpErrors> setting doesn't get read, which causes IIS to revert to the default settings (i.e. DetailedLocalOnly)?

    Update: @LazyOne suggested setting the above config at the server level, which I had already tried. This resulted in just raw 500 errors.
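
    One way to test that theory is to set detailed errors at the server level with appcmd, so the setting no longer depends on the site's (possibly unparseable) web.config. Note that existingResponse="PassThrough" at the server level will happily pass through a bare 500 response, which may be exactly the "raw 500" seen in the update; a sketch that keeps IIS's own detailed page instead, run from an elevated prompt:

        %windir%\system32\inetsrv\appcmd.exe set config /section:httpErrors /errorMode:Detailed /existingResponse:Replace

    If remote browsers then see the detailed 500.19, that supports the idea that the site-level web.config was failing to parse before its <httpErrors> element could take effect.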

    Read the article

  • Alternatives to ntbackup

    - by Chris J
    Is anyone aware of any good (for a given value of "good") alternatives to ntbackup? There are a few quirks with ntbackup that I still find odd, and occasionally (and for no obvious reason) ntbackup just doesn't back up - usually complaining that the wrong tape is inserted, even though it's been told to just use whatever tape is in the drive.

    I've experimented with cygwin's tar and cpio and with Win32 ports of these utilities, but I've not had any luck getting them to see the tape device (so if someone does use these utilities to write to a tape device, I'd also be interested to know how).

    Essentially all I'm looking for is a reliable program that I can tell to back up a list of locations to the tape, and that doesn't care about things like formatting tapes or sticking volume labels on them (and that conversely makes it fairly straightforward to restore from as well). On the flip side, I don't want something that will attempt to manage our tapes. Don't get me wrong here - ntbackup can do the job, but its quirks are making me look at possible alternatives. Any suggestions? If it's open source, even better.
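
    For what it's worth, a sketch of driving a tape directly from Cygwin's tar, on the assumption that Cygwin's /dev/st0 device maps to the first Windows tape drive (untested here; /dev/nst0 would be the no-rewind variant for stacking several archives on one tape):

        # write a fixed list of locations to whatever tape is in the drive
        tar -cvf /dev/st0 /cygdrive/c/data /cygdrive/d/shares
        # list the archive back from the tape to verify it landed
        tar -tvf /dev/st0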

    Read the article

  • Powershell script to delete sub folders and files if creation date is >7 days but maintain parent folders of sub folders and files <7 days old

    - by Mark
    I'm currently using the PowerShell script below to delete all files, directories and subdirectories of "$dump_path" that are seven days or older, based on the creation date (not the modified date). The problem with this script is this: if folder "A" is seven (or more) days old, it will be deleted even if its subfolders and files are less than seven days old.

    What I would like this script to do is: delete all files from the root and from all subfolders of "$dump_path" that are seven or more days old, but keep the parent folder(s) of any files and folders that are less than seven days old, even if that means the parent folders themselves are more than seven days old. If all of a parent folder's subfolders and files are seven days or older, then the parent can be deleted.

    Slightly obscure problem, I know, but the intention is to have a 7-day retention period for all data in a 'sandbox' location of our shared areas. Also, as an added bonus, it would be great if it could generate a log of what it deletes and e-mail it out after deletion. Thank you for reading, and I hope that all makes sense! Mark

        # set folder path
        $dump_path = "c:\temp"
        # set minimum age of files and folders
        $max_days = "-7"
        # get the current date
        $curr_date = Get-Date
        # determine how far back we go based on current date
        $del_date = $curr_date.AddDays($max_days)
        # delete the files and folders
        Get-ChildItem $dump_path |
            Where-Object { $_.CreationTime -lt $del_date } |
            Remove-Item -Recurse
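
    For reference, a sketch of one way to get the keep-recent-parents behaviour: delete only old files first, then sweep up directories that are old and now empty, deepest path first (untested; logging and mail are left as a note below):

        $dump_path = "c:\temp"
        $del_date  = (Get-Date).AddDays(-7)

        # 1) remove only *files* older than the cutoff, wherever they sit
        Get-ChildItem $dump_path -Recurse |
            Where-Object { -not $_.PSIsContainer -and $_.CreationTime -lt $del_date } |
            Remove-Item -Force

        # 2) remove old directories that are now completely empty, deepest path
        #    first, so a parent holding anything recent is never touched
        Get-ChildItem $dump_path -Recurse |
            Where-Object { $_.PSIsContainer -and $_.CreationTime -lt $del_date } |
            Sort-Object { $_.FullName.Length } -Descending |
            Where-Object { (Get-ChildItem $_.FullName -Recurse -Force | Measure-Object).Count -eq 0 } |
            Remove-Item -Force

    For the log/e-mail bonus, the same Where-Object output can be passed through Tee-Object -FilePath to capture what is about to be removed, and the resulting file sent with Send-MailMessage afterwards.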

    Read the article

  • How do I configure nVidia drivers on a Portable Ubuntu setup?

    - by Nicholas Flynt
    I've been pulling my hair out over this one for a couple of days now; Google is no help. I've created a wonderful (until this issue) portable copy of Ubuntu Linux that will boot on mostly anything, by using a USB enclosure for my laptop's 80GB SATA drive. So far so good: it boots and runs on everything, and on non-nVidia setups it was even detecting the hardware, or letting me install the required drivers, for hardware acceleration and Compiz. Because, you know, the wobbly windows are the most awesome thing ever.

    Anyway, my desktop machine has an nVidia card, so I figured I'd just install the nVidia drivers like before and everything would work happily. Not so: now the desktop and any other nVidia machines work great, but it seems to have completely disabled every other graphics card. When the kernel module detects that an nVidia card isn't present, it shoots up this nasty little dialog box giving me the option to boot into "low graphics" mode, which doesn't even allow me to use the correct screen resolution, much less see the installed graphics card and try to configure a driver for it.

    Is there any way to configure Ubuntu (with the dreaded nVidia kernel module) so that it can use nVidia's drivers when an nVidia card is present, and default to the normal (not low-graphics) setup in other cases, so that it has a fair chance of using what's actually present? I'm not afraid to muck with config files; I just don't know the underlying system well enough to feel comfortable diving in without a push in the right direction. Thanks guys!

    Read the article

  • Skydrive for desktop not showing icon in notification area

    - by jeruntime
    Is this normal behavior? I'm concerned that some files will not be transferred. For example, I need to be sure that it's safe to put my laptop to sleep, so that later, when I'm on my desktop, I won't have to open up my laptop and sync again because I closed it too early. The icon doesn't even appear when I THINK it is syncing. Thanks for any help in advance.

    Edit 1: Yeah, I know the app isn't required or even offered anymore, but what I was wondering was whether there is some way to tell if it's syncing or not. The old desktop app had an icon in the notification area with an animation for when it was syncing, but now there isn't anything there. When I try to make it appear, it says the application isn't active and the icon will show the next time it is. Its version is 17.0.2015.0811. Is it possible to force the icon to show in the notification area if it isn't designed to show up there at all? I basically just want the same functionality SkyDrive has on Windows 8.

    Read the article

  • Openldap, groups, admin groups, etc

    - by Juan Diego
    We have a Samba server as a PDC with OpenLDAP. So far everything is working; even Windows 7 can log on to the domain. Here is the tricky part: we have many departments, each department has its own IT guys, and these IT guys should be able to create users in their department and change any info of the users in their department.

    My idea was to create two groups for each department, for example "Department1" and "Admins Department1", where Admins Department1 has write privileges over the members of Department1:

        dn: ou=People,dc=mydomain,dc=com,dc=ec
        objectClass: top
        objectClass: organizationalUnit
        ou: People

        dn: cn=Admins,ou=Group,dc=mydomain,dc=com,dc=ec
        objectClass: groupOfNames
        objectClass: top
        cn: Admins

        dn: cn=Admins Department1,cn=Admins,ou=Group,dc=mydomain,dc=com,dc=ec
        objectClass: groupOfNames
        objectClass: top
        cn: Admins Department1
        member: uid=jdc,ou=People,dc=mydomain,dc=com,dc=ec
        structuralObjectClass: groupOfNames

    I don't know whether I should make Department1 part of Domain Users:

        dn: cn=Department1,cn=Domain Users,ou=Group,dc=mydomain,dc=com,dc=ec
        objectClass: groupOfNames
        objectClass: top
        cn: Department1
        member: uid=user1,ou=People,dc=mydomain,dc=com,dc=ec

    Or just create the departments like this:

        dn: cn=Department1,ou=Group,dc=mydomain,dc=com,dc=ec
        objectClass: groupOfNames
        objectClass: top
        cn: Department1
        member: uid=user1,ou=People,dc=mydomain,dc=com,dc=ec

    It seems that when you use the smbldap tools, users are part of Domain Users by default even if you don't list them in its memberUid attribute; when I use finger they show up as part of the Domain Users group. I don't want the department admins to be Domain Admins, because then they would have power over all the users, unless I am mistaken.

    I also have trouble with the ACLs. I was trying to create an ACL for members of this Admins group, and was experimenting with this search, but it didn't work:

        ldapsearch -x "(&(objectClass=organizationalPerson)(member=cn=Admins Department1,ou=Group,dc=mydomain,dc=com,dc=ec))"

    I am open to suggestions.
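
    One layout that keeps the ACLs simple is giving each department its own OU under People and granting its admin group write access there. A slapd.conf-style sketch using the names from above (the per-department OU is an assumption about your tree, and the rule needs translating to olcAccess form if you use cn=config):

        # members of "Admins Department1" manage Department1's users; everyone else reads
        access to dn.subtree="ou=Department1,ou=People,dc=mydomain,dc=com,dc=ec"
            by group.exact="cn=Admins Department1,cn=Admins,ou=Group,dc=mydomain,dc=com,dc=ec" write
            by * read

    group.exact matches a groupOfNames via its member attribute by default, which fits the entries above, and it avoids handing the department admins Domain Admins rights.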

    Read the article

  • Windows 7 Blank Screen on Boot / Login

    - by Greg
    I have a new system that's having a few problems... sometimes (seems to be when the PC is cold, i.e. has been switched off for a while, though that could be my imagination) I get a blank blue screen when I boot up. The system boots normally and auto-logs-in. The desktop loads and I'm even able to launch applications, but then everything disappears and the screen goes to the default Windows desktop blue colour (not the desktop image, just a plain blue with no mouse cursor). At this point the machine completely locks up - I'm unable to even toggle Num Lock and have to hold in the power button for 5 seconds to kill it. Interestingly, if I manage to launch some applications before it goes blank, they will usually crash... sometimes explorer.exe will crash too. When I reboot, the system is fine and stable.

    I've installed the latest graphics drivers and run memtest86+ for 6 passes (and counting) with no errors. The system specs are:

    CPU: Intel i7 2.66 @ 3.4GHz
    RAM: 6GB (3 * 2GB DDR3)
    HDD: 128GB Crucial M225 SSD
    Motherboard: Gigabyte EX58-UD3R
    Gfx: ATI Radeon Sapphire 5870 1GB

    Note: There are a few similar questions, but I haven't found one that matches my symptoms.

    Read the article

  • How to use the AWUS036H on MacBook Pro with Lion and Backtrack in VM?

    - by Swader
    I have the AWUS036H USB WiFi adapter and have recently upgraded OS X to Lion. The thing is, there are no Lion drivers for the AWUS036H, and I would have to boot into 32-bit mode every time I want to use the adapter, as per the instructions here: http://www.youtube.com/watch?v=n9_HAGi1ce0

    I also want to install BackTrack, as I deal with networks a lot for my company. While this would be a simple matter on any other laptop, the company-issued MacBook does not allow booting into any OS other than Mac OS X, or Windows with Boot Camp. Since dual-booting into BT is not an option, I would like BackTrack to run in a VM inside Mac OS X Lion - and it does. It works like a charm inside VirtualBox.

    But since there are no 64-bit drivers for the WiFi adapter, Lion doesn't recognize it and cannot install it. This, in turn, means that BackTrack cannot see it, even though the AWUS036H usually works flawlessly with BT. How can I make my VM-based BT see the WiFi adapter even if the parent OS doesn't see it, if that's possible at all? Is there a way, or am I better off buying a new WiFi adapter that supports OS X 10.7, such as the AWUS036NHR?
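
    One thing worth trying before buying new hardware: hand the adapter straight to the BackTrack guest with VirtualBox's USB passthrough, so the guest's own rtl8187 driver talks to it and Lion never needs a driver at all. A sketch (the VM name is an assumption; USB 2.0 passthrough may need the VirtualBox Extension Pack):

        VBoxManage list usbhost                           # find the adapter's UUID in the device list
        VBoxManage controlvm "BackTrack" usbattach <uuid-from-the-list>

    Once attached, the adapter disappears from the Mac side and shows up on the guest's USB bus, where BackTrack's stock driver should claim it.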

    Read the article

  • Ubuntu: how to get audio to work in both Spotify (under Wine) and Flash (in Firefox)?

    - by Jonik
    I'm running Spotify on Linux using Wine. Sound worked great (even though the sound test in winecfg failed!), until I installed the alsa-oss package yesterday to get Flash sound working in Firefox. Now Spotify says: "There is a problem with your sound card. Spotify can't play music."

    So the question is, how to get the sound in Spotify working again, so that it also keeps working in Flash & Firefox? Tweak some ALSA settings? Spotify settings? Add/remove some packages? By the way, curiously, now that sound doesn't work in Spotify, winecfg's "Test Sound" does work!

    This is Ubuntu 8.04 (Hardy). Sound card / driver is probably an integrated AC'97. Please mention if any additional information about the system is needed!

    Update: I have Flash 10 installed (outside the packaging system, using $MOZ_PLUGIN_PATH env variable), but also had Flash 9 from the flashplugin-nonfree package - and the earlier version was being used by Firefox! Based on what Mike Arthur said about Flash and alsa-oss, I removed the older Flash (flashplugin-nonfree package) and alsa-oss - and Flash sound still works, which is nice. But for some reason Spotify still doesn't play sound, even though things should now be like they were originally...

    Update 2: Got it working, all smoothly, finally.

    Read the article

  • Nginx + PHP5-FPM repeated cut outs 502

    - by James
    I've seen a number of questions here that highlight random 502s (Nginx + PHP-FPM = "Random" 502 Bad Gateway) and similar timeouts when using Nginx + PHP-FPM. Even with all the questions, I'm still unable to find a solution.

    Using Ubuntu 10.10 + Nginx + PHP5-FPM + APC, every 1 out of 4 requests ends in a timeout and failure. This isn't a load issue or large traffic; it happens even in a dev environment with one person. I am doing this across three 1GB machines, each with the same configuration and the same problems.

    fastcgi_params:

        fastcgi_param QUERY_STRING $query_string;
        fastcgi_param REQUEST_METHOD $request_method;
        fastcgi_param CONTENT_TYPE $content_type;
        fastcgi_param CONTENT_LENGTH $content_length;
        fastcgi_param SCRIPT_NAME $fastcgi_script_name;
        fastcgi_param REQUEST_URI $request_uri;
        fastcgi_param DOCUMENT_URI $document_uri;
        fastcgi_param DOCUMENT_ROOT $document_root;
        fastcgi_param SERVER_PROTOCOL $server_protocol;
        fastcgi_param GATEWAY_INTERFACE CGI/1.1;
        fastcgi_param SERVER_SOFTWARE nginx/$nginx_version;
        fastcgi_param REMOTE_ADDR $remote_addr;
        fastcgi_param REMOTE_PORT $remote_port;
        fastcgi_param SERVER_ADDR $server_addr;
        fastcgi_param SERVER_PORT $server_port;
        fastcgi_param SERVER_NAME $server_name;
        fastcgi_param REDIRECT_STATUS 200;

    /etc/php5/fpm/main.conf:

        ; FPM Configuration ;
        ;include=/etc/php5/fpm/*.conf

        ; Global Options ;
        pid = /var/run/php5-fpm.pid
        error_log = /var/log/php5-fpm.log
        ;log_level = notice
        ;emergency_restart_threshold = 0
        ;emergency_restart_interval = 0
        ;process_control_timeout = 0
        ;daemonize = yes

        ; Pool Definitions ;
        include=/etc/php5/fpm/pool.d/*.conf

    /etc/php5/fpm/pool.d/www.conf:

        [www]
        listen = 127.0.0.1:9000
        ;listen.backlog = -1
        ;listen.allowed_clients = 127.0.0.1
        ;listen.owner = www-data
        ;listen.group = www-data
        ;listen.mode = 0666
        user = www-data
        group = www-data
        ;pm.max_children = 50
        pm.max_children = 15
        ;pm.start_servers = 20
        pm.min_spare_servers = 5
        ;pm.max_spare_servers = 35
        pm.max_spare_servers = 10
        ;pm.max_requests = 500
        ;pm.status_path = /status
        ;ping.path = /ping
        ;ping.response = pong
        request_terminate_timeout = 30
        ;request_slowlog_timeout = 0
        ;slowlog = /var/log/php-fpm.log.slow
        ;rlimit_files = 1024
        ;rlimit_core = 0
        ;chroot =
        chdir = /var/www
        ;catch_workers_output = yes
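
    A few pool settings that are commonly poked at for this exact symptom - these are assumptions to test, not a known fix:

        ; /etc/php5/fpm/pool.d/www.conf
        pm.max_requests = 500            ; recycle workers; guards against APC/extension memory leaks
        catch_workers_output = yes       ; surface worker stderr and crash notices in php5-fpm.log
        request_terminate_timeout = 300  ; rule out FPM killing requests at the current 30s

    If, with worker output switched on, the 502s line up with "child exited on signal 11" lines in the FPM log, APC is the usual suspect from that era.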

    Read the article

  • How to run a WebPy server on port 8080 using DDNS of dlink router and to access this site from internet?

    - by nuke1010
    I have two major issues with setting up a web server using my D-Link DIR-600L router.

    Issue 1: I run a WebPy server on port 8080, but the DDNS service providers (like dlinkddns.com or dyndns.org) only allow port 80. I can run the server on port 80 with the sudo command, but my server becomes vulnerable if I give it root access. So I tried port forwarding in the router and on the server, but with no luck - I don't know if I did it correctly.

    Issue 2: Even when the server runs on port 80, I can only access the site from my local machines using the registered domain name (say, nikz.dyndns.org). Nobody on the internet can load this site, even when it's completely up. Watching the server log, requests from other clients never reach my server.

    I need to run this server on port 8080 and access the site from the internet. How can I do it? Any ideas?
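
    A common way around the privileged-port problem is to keep WebPy on 8080 as a normal user and let the kernel redirect port 80 to it; a sketch (the interface name is an assumption):

        # external clients arriving on port 80 get handed to 8080
        sudo iptables -t nat -A PREROUTING -i eth0 -p tcp --dport 80 -j REDIRECT --to-ports 8080
        # the same for connections the server makes to itself (PREROUTING never sees those)
        sudo iptables -t nat -A OUTPUT -d 127.0.0.1 -p tcp --dport 80 -j REDIRECT --to-ports 8080

    For issue 2, the router also needs a port forwarding / virtual server rule sending external TCP port 80 to the server's LAN IP; DDNS only maps the hostname to your public IP and says nothing about ports, which would explain why outside clients never show up in the log.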

    Read the article

  • Getting Windows (VMware) to load from OSX's localhost without an Internet Connection

    - by Jonah Goldstein
    I'm using MAMP to host my local sites, and VirtualHostX so that I can access sites during local development via a convenient URL like mysite.dev. I'm also running Windows XP via VirtualBox, and it would be great to be able to load any of my local sites within Windows while offline, as I'm currently often working without a connection, on the move, unfortunately.

    I know that I can append my IP and a nice domain name to the hosts file in C:/WINDOWS/system32/drivers/etc, and I can find my IP simply through the terminal with "ifconfig" while I'm online. The problem is that when I'm not online, there's no IP. And even when there is one (when I have a connection), I still have to grab it and update the Windows hosts file all the time, since I'm developing from a laptop and get a new IP at the drop of a hat.

    I found a tutorial where the author is able to get a permanent IP. He uses VMware Fusion as his virtual machine software, which is the only difference between his setup and mine. By running the terminal command "ifconfig vmnet1" he gets a secret IP the virtual machine uses to talk to OS X - and that doesn't change, which is awesome. I'm assuming it exists even if he's offline. His tutorial is here: http://bit.ly/U2lq

    It would be pretty fantabulous if I could replicate this with VirtualBox. Anyone have ideas? Thanks :)
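
    VirtualBox has the same trick as Fusion's vmnet1: a host-only network whose Mac-side IP never changes, online or not. A sketch (the default address is usually 192.168.56.1, but check your install):

        VBoxManage hostonlyif create      # creates vboxnet0 on the Mac side
        ifconfig vboxnet0                 # note the fixed host IP, 192.168.56.1 by default

    Attach the XP guest's network adapter to vboxnet0 (Settings > Network > Host-only Adapter), point hosts entries like "192.168.56.1  mysite.dev" at that IP in C:/WINDOWS/system32/drivers/etc/hosts, and make sure MAMP's Apache is listening on that interface too.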

    Read the article

  • IIS 7.5 Siteminder is not protecting ASP.net MVC requests

    - by HariM
    We are trying to use ASP.NET MVC with SiteMinder for single sign-on. This is on Windows Server 2008 R2 with IIS 7.5, SiteMinder agent version 6QMR6.

    Problem: SiteMinder protects physical files that exist, but it does not protect the folder when we try to access a non-existent file. It should redirect to the login page even if the requested file doesn't exist, as long as the user is accessing a protected folder. How do I configure IIS 7.5 not to verify that a file exists before SiteMinder authentication runs? SiteMinderWebAgent is a handler (wildcard script map) we created using ISAPI6WebAgent.dll.

    How do I protect ASP.NET MVC requests with SiteMinder? (I added this because my previous question did not solve the problem.) MVC requests show up in the IIS log, but not in the SiteMinder log.

    Update: Microsoft Support says that IIS 7.5, and earlier versions too, doesn't support wildcard mappings on two ISAPI handlers at once. Currently, in my case, SiteMinder has a * wildcard and ASP.NET MVC (whose handler is aspnet_isapi) has a * wildcard to handle the requests, and ordered priority doesn't work for wildcard mappings with just *. I'm not convinced by the answer, but will wait till tomorrow for them to get back to me.

    Read the article

  • Wireless Activity Monitoring for PCI DSS Compliance

    - by dkusleika
    In an effort to be PCI DSS compliant, I took a trustkeeper.net questionnaire. I failed the question that asks:

    Is the presence of wireless access points tested for by using a wireless analyzer at least quarterly or by deploying a wireless IDS/IPS to identify all wireless devices in use? (SAQ #11.1)

    My only wireless access point is outside my firewall, so even if you cracked my wireless you couldn't get inside my domain (unless you cracked that too). My firewall doesn't have IPS, and I couldn't tell whether it has IDS. I looked around for a wireless analyzer, but what I found was $500, which is a little pricey for a business my size. And even if I bought it, I'm not sure I would understand what it tells me. Surely there are smaller or less sophisticated businesses that take credit cards and have solved this.

    My questions are: What are the risks if someone were to crack my wireless? (Could they read all internet traffic? Just wireless traffic? Just use my internet connection?) And what is the best/cheapest way to test my connection point quarterly? Should I buy the $500 analyzer?

    The domain is Windows Server 2000, the firewall is a SonicWall Pro 2040, and the router is an 8-port D-Link.
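
    For the quarterly check, a $500 analyzer isn't strictly necessary; a laptop walk-through that records every visible access point is a common low-budget approach. A sketch (the interface name is an assumption):

        # record every access point in range, with MAC address, network name and channel
        sudo iwlist wlan0 scan | egrep 'Address|ESSID|Channel' > wifi-scan-$(date +%Y%m%d).txt

    Keep the dated files as evidence of the quarterly test and scan them for access points you don't recognise; free tools like Kismet (Linux) or inSSIDer (Windows) do the same job with friendlier output.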

    Read the article

  • Acer recovering windows vista

    - by Charlie Pigarelli
    My computer's history is very long, even though the machine is only 4 years old. A year ago I installed Windows 7 on this Acer M1610, which had Vista before; my technician left me two recovery discs for "Acer Vista" before upgrading it to Windows 7. Then the computer had some trouble: the graphics card broke, and we decided to use another computer.

    Yesterday I had the great idea of fusing the two computers into a better one, so I moved the graphics card from the second computer into the Acer, and everything went well. Then the speed trouble it had before came back, so I decided to reinstall the very first Windows: Vista again.

    I booted the computer with those two DVD-Rs my technician left me, and at the end of the process it asked me to insert "the backup CD number one or the system disk". I found two original Acer "Blank Recovery Disc" DVD-Rs and tried those: rejected. I tried blank DVD+Rs and DVD-Rs: rejected. I tried CDs: rejected. I don't have any system disc with me, except for those two DVD-Rs my technician left. What am I supposed to do now? I even tried the fabled Alt+F9/F10 that should start the recovery without any disc... but nothing happened.

    PS: The installation cannot complete if I do not insert the right disc. (The recovery disc uses Acer eRecovery Management as its recovery software.)

    Read the article

  • Suggestions for Windows 8 migration [closed]

    - by Big Endian
    I'm thinking of migrating to Windows 8. At first I hated it, but I'm pretty sure the Windows 8 model is the future, and I don't particularly want to end up hating the future like my parents, frustrated and bewildered by anything past Windows XP.

    I'm currently running Windows 7, and my system has been accumulating problems - probably from installing too much software, changing firewall settings, installing Ubuntu alongside Windows, and... well, I'm not sure, but my computer has been buggy in unexpected ways lately (freezing and unfreezing, the display driver crashing and recovering, and what I call a "deep freeze/thaw cycle" where the mouse won't even move for a while). I'm good at solving computer problems, but I can't seem to get to the root of these, and my best idea for fixing them is to make sure I've backed up every file and then reinstall the entire OS. Luckily for me, a new OS is just around the corner, so this would be a good time to get two things out of the way at once.

    The problem I see is that the upgrade options all appear to be "seamless". I don't want a seamless upgrade; I want to wipe the slate clean and start all over. Does this mean I will have to buy a full, new copy of Windows 8 rather than one of the cheaper upgrade options? Or does it not make sense for me to go to Windows 8 at all, given that I have a laptop, not a tablet? Maybe I should just reinstall Windows 7, or even call good enough good enough, try to eliminate the bugs, and start with a fresh slate in 2-3 years after this computer eventually dies entirely from (inevitable) hardware failure.

    What would be the advantages, disadvantages and costs of each option? How would I go about upgrading to Windows 8 if that's the option I choose, and what is your personal opinion about my situation?

    Read the article

  • Is there an high quality natural text reader for the mac?

    - by Another Registered User
    I'm reading about 150 pages of text on screen every day, and I will have to read about 15,000 in the upcoming months. No joke. The problem is this: I suffer from a sort of attention deficit hyperactivity disorder which forces me to read every sentence up to 10 times until I really get it.

    Mac OS X Snow Leopard has a built-in text reader with the name "Alex". Although it is already pretty good quality, I know there are far better natural-sounding voices out there. I have heard voices that are absolutely amazing compared to Alex - so good that you can't tell the difference between a real person and a computer anymore. Alex still has this metallic quality in his voice, which makes my ears hurt after 8 hours of listening.

    The next problem with Alex is that he never pauses after a sentence. It's also not possible to stop and think about a sentence and then continue reading, or to have him repeat a sentence, without tedious text selection and shortcut usage.

    Actually, the best tool I can imagine would have the option to read a sentence and then either move on to the next one after pressing a special key, or repeat the previous one after pressing another. That would help so much! And if that works with one of those Bell Labs / AT&T / whatever super-natural voices, even better! It would already be a great relief if there were just a better tool to control Alex: to have him pause after sentences, or speak big chunks of text sentence by sentence with fine-grained control over repetition and moving on. Is there anything?
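
    The built-in say command can be scripted into exactly that press-a-key reader; a rough bash sketch (the sentence splitting is naive, and the input file name is a placeholder):

        #!/bin/bash
        # speak a text file one sentence at a time: any key = next, r = repeat, q = quit
        perl -pe 's/([.?!]) /$1\n/g' "$1" > /tmp/sentences.txt
        while IFS= read -r s; do
            while :; do
                say -v Alex "$s"
                read -rsn1 -p 'r = repeat, q = quit, anything else = next ' key < /dev/tty
                [ "$key" = "q" ] && exit 0
                [ "$key" != "r" ] && break
            done
        done < /tmp/sentences.txt

    Third-party voices that plug into the system speech engine (so say -v can use them), such as the Infovox iVox / Acapela ones, are generally considered much less metallic than Alex.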

    Read the article

  • User directive in nginx generates error despite running as UID root

    - by Joost Schuur
    I'm running nginx on a Mac OS X machine, installed with brew, and when I launch nginx, even with sudo, I get the following warning in my log file over and over again:

    4/21/11 2:03:42 AM org.nginx[3788] nginx: [warn] the "user" directive makes sense only if the master process runs with super-user privileges, ignored in /usr/local/etc/nginx/conf/nginx.conf:2

    From nginx.conf:

        user jschuur staff;

    I'm already launching nginx with sudo, since I want it to listen on port 80. Shouldn't that be enough to give it the proper super-user privileges? The nginx binary as it's installed:

        jschuur@Glenna:sbin ? master ls -la
        total 4544
        drwxr-xr-x   3 jschuur staff     102 Apr 12 20:53 .
        drwxrwxr-x  15 jschuur staff     510 Apr 12 15:25 ..
        -rwxr-xr-x   1 jschuur staff 2325648 Apr 12 20:39 nginx

    FWIW, I recompiled the binary to set Passenger up and moved it from its original location into /usr/local/sbin.

    Update: As it turns out, Mac OS X was restarting nginx after I'd stopped it, because the launchd plist in ~/Library/LaunchAgents had set it to 'KeepAlive'. However, because I installed this plist into my local user's LaunchAgents folder, as opposed to /Library/LaunchAgents (or better yet /Library/LaunchDaemons, which run before you even log on), it wasn't executed as root. Because of an error about not having permission to use port 80, it actually exited right away, but it still wrote to the same log file as the nginx process I started with sudo. I had thought the errors stemming from the automatic restart were actually coming from my manual restart via sudo.

    So, bottom line, problem solved. The real problem here was that the homebrew instructions specifically ask you to install the plist file into a location that won't allow a local site to use port 80.
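
    For anyone landing here with the same setup, a sketch of the daemon version of that plist, living in /Library/LaunchDaemons so launchd starts it as root (the label and paths are illustrative):

        <?xml version="1.0" encoding="UTF-8"?>
        <!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
        <plist version="1.0">
        <dict>
            <key>Label</key>
            <string>org.nginx</string>
            <key>ProgramArguments</key>
            <array>
                <string>/usr/local/sbin/nginx</string>
                <string>-g</string>
                <string>daemon off;</string>
            </array>
            <!-- KeepAlive needs a foreground process, hence "daemon off;" -->
            <key>RunAtLoad</key>
            <true/>
            <key>KeepAlive</key>
            <true/>
        </dict>
        </plist>

    Loaded with sudo launchctl load -w /Library/LaunchDaemons/org.nginx.plist, the master process really does run as root, so the user directive warning should go away.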

    Read the article

  • How to remap IPs visible from local machine to IPs visible from a machine I have SSH access to?

    - by gooli
    I'm so far out of my depth I don't even know what to google for. There's a server I can connect to via SSH, and via that server I can SSH into other servers on its subnet. What I want to do is access the machines that server has access to directly.

    Say the server's IP is 192.168.7.7 and it is the only address in the 192.168.x.x range I can reach. I'd like to configure things in such a way that when I access, say, 192.168.7.100 on my machine, the connection goes through an SSH tunnel I open to 192.168.7.7 and out to 192.168.7.100. I would like this to work for any port, if at all possible.

    I know I can set up an HTTP proxy and even a SOCKS proxy, but I'm wondering if there is a way to actually remap some of the IPs my machine sees to IPs only visible from the remote machine. What would this configuration be called? Is this NAT, VPN, IP2IP or something else? How can I set this up on a Windows client box that connects via SSH to a Linux box? It sounds to me like I need to set up some kind of filtering on the network driver, or possibly a virtual NIC, but I'm not sure where to go next.
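
    Short of a full VPN, plain SSH local forwarding already covers the one-service-at-a-time case; a sketch with placeholder user name and ports:

        # local port 8080 -> (tunnel to 192.168.7.7) -> 192.168.7.100 port 80
        ssh -L 8080:192.168.7.100:80 user@192.168.7.7
        # then browse http://localhost:8080/ ; add one -L per host/port pair

    On Windows, PuTTY does the same thing under Connection > SSH > Tunnels. The whole-IP, any-port remapping you describe is essentially a VPN (with NAT happening on the far side); the per-port version above is what's usually called SSH tunnelling or port forwarding.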

    Read the article

  • Very uneven CPU utilization with SQL Server 2012 on 2 processor computer with 16 cores / processor

    - by cooplarsh
    After installing SQL Server 2012 Enterprise with the Server + CAL license model on a computer with 2 processors, each with 16 cores (and no hyperthreading involved), and putting the server under extremely heavy load, the 16 cores on the first processor were very underutilized, the first 4 cores on the second CPU were heavily utilized, and the last 12 cores were not used at all (because of the 20-core limit for this SQL Server edition). Total CPU utilization was displaying as around 25%. Unfortunately, the server suffered from extremely poor performance, even though the load would have been nowhere near as bad had the tasks been evenly distributed across the 20 cores.

    The Windows Server was running as a VMware virtual machine under ESX Server, but all of the CPUs were allocated to the Windows server. We tried changing affinity settings (e.g., allocating most cores to CPU and the others to I/O), but that didn't help solve the performance problems.

    Upgrading the product edition to SQL Server 2012 Enterprise Core not only allowed SQL Server to utilize the 12 previously unused cores on the second processor, it also resulted in a much more even distribution of tasks across all of the processors. To get through the backlog of requests, CPU utilization jumped to around 90% and then came down to around 33% once it was caught up. Performance improved dramatically once we failed over to the newly upgraded instance, and the performance issues went away.

    I was wondering if anyone knows what might cause SQL Server to distribute the load this unevenly: relying almost exclusively on the first 4 cores of the second processor, which had 12 cores idle, while allocating only a few tasks to each of the 16 cores on the first processor. Also, is there any way we could have more evenly distributed the load across the 20 usable cores without the product edition upgrade? The flip side of that question is: what did the product upgrade do that caused SQL Server to start evenly distributing the load across all of the cores it recognized? Thanks for any insight or links that might help me make sense of what was happening.
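
    A quick way to see this from inside SQL Server: the scheduler DMV shows which cores the engine actually uses, and cores excluded by the licence limit appear as VISIBLE OFFLINE:

        -- one row per scheduler; parent_node_id is the NUMA node (CPU socket)
        SELECT parent_node_id, scheduler_id, [status], is_online
        FROM sys.dm_os_schedulers
        WHERE scheduler_id < 255;   -- excludes hidden/internal schedulers

    Given that the Server + CAL licence caps SQL Server 2012 at 20 cores, one full 16-core NUMA node plus 4 cores of the second node being online is consistent with what you saw; the Core edition lifts the cap, which would be why the distribution evened out after the upgrade.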

    Read the article

  • sudo or acl or setuid/setgid ?

    - by Xavier Maillard
    Hi, for a reason I do not really understand, everyone wants sudo for anything and everything. At work we even have as many sudoers entries as there are ways to read a logfile (head/tail/cat/more, ...). I think sudo is being defeated here. I'd rather use a mix of setgid/setuid directories and add ACLs here and there, but I really need to know the best practices before starting.

    Our servers have %admin, %production, %dba, %users - i.e. many groups and many users. Each service (mysql, apache, ...) has its own way of installing privileges, but members of the %production group must be able to consult configuration files and even log files.

    There is still the option of adding them to the right groups (mysql, ...) and setting the right permissions, but I do not want to usermod all users, and I do not want to modify the standard permissions, since they could change with each upgrade. On the other hand, setting ACLs and/or mixing setuid/setgid on directories is something I could easily do without "defacing" the standard distribution.

    What do you think about this? Taking the mysql example, it would look like this:

        setfacl d:g:production:rx,d:other::---,g:production:rx,other::--- /var/log/mysql /etc/mysql

    Do you think this is good practice, or should I definitely usermod -G mysql and play with the standard permissions system? Thank you.
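
    As a side note, that setfacl line is missing its mode flags; a corrected sketch of the same idea (capital X grants execute on directories only):

        # grant the group read (and directory-traverse) rights on what exists now
        setfacl -R -m g:production:rX /var/log/mysql /etc/mysql
        # default ACLs so files created later (upgrades, log rotation) inherit the same
        setfacl -R -d -m g:production:rX /var/log/mysql /etc/mysql
        getfacl /var/log/mysql          # verify the result

    Be careful with the other::--- part of the plan: if /etc/mysql stays root-owned, masking the "other" bits could lock the service out of its own config, depending on when it drops privileges.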

    Read the article

  • Is it possible to install all packages from an APT repository?

    - by Kristoffer Hagen
    Is it possible to install all packages from an APT repository? I know it is possible to do it manually, but then you would need to know all the package names, and I don't. Any suggestions? Thanks.

    Update: Well, you guys are going to kill me for this, but the reason for my madness is that I want to install all the packages from BackTrack into my Ubuntu installation. I really don't like the idea of having it in a VM, and having a separate partition for it is even more out of the question. I know that the folks at BackTrack don't like it when people leech their repositories, but that's what you get for releasing open source software. Stupid? Maybe. A valid reason? Probably not. Do I still want it? Yes.

    Another edit: I have now given up on this, as it seems impossible to get it to work even by manually installing packages.
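
    For the record, a sketch of the brute-force route: pull every package name out of the repo's downloaded index and hand the list to apt-get (the lists filename glob is an assumption; it depends on the repository URL):

        # every "Package:" stanza in the repo's index, deduplicated, then installed
        awk '/^Package:/ {print $2}' /var/lib/apt/lists/*backtrack*_Packages | sort -u |
            xargs sudo apt-get install -y

    Expect plenty of conflicts with stock Ubuntu packages, which is probably why it "seems impossible": many BackTrack packages replace system components rather than sit beside them.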

    Read the article
