Search Results

Search found 840 results on 34 pages for 'jim maguire'.

Page 14/34

  • Why does a Remote Desktop Redirected Printer Doc appear every time I connect to Windows Server 2003 SBS?

    - by Jim Dagg
    I've run into a weird, persistent issue with a remote desktop connection. Every time I successfully log into a server running Windows Server 2003 SBS, without taking any further action, a print job titled "Remote Desktop Redirected Printer Doc" spontaneously appears on my machine after a few seconds. The document is 4K, datatype RAW, processor "WinPrint". I've heard of people running into this issue before, but can't seem to hunt down a coherent solution. It's a minor annoyance, but Windows complains about a print job that, as far as I know, came from nowhere. Any thoughts on why this occurs and how I could prevent it from happening?

    Read the article

  • How can I set up Bluepill to monitor a Rails app running via Passenger (mod_rails)?

    - by Jim Jeffers
    I recently launched a site running Phusion Passenger. Unfortunately, the site went down due to a frozen thread, and I was only able to save the server by running kill -9 on the specific PID. I had thought Passenger managed this automatically. I have a server with 1GB of memory running one Rails app, with Passenger allowed up to 7 instances. When I discovered the site was down, I found that Passenger had spawned 6 instances, one of which was using over 800MB of memory and causing the server to swap. As a result I am hoping to set up something like Bluepill on the server, but I'm slightly confused about how to go about it, mainly because Bluepill expects to start and stop the processes it monitors. In our case Passenger already restarts processes for us, so we only need to monitor the PIDs of Passenger's instances and kill them once they've grown too large (something along the lines of the sketch below). Has anyone here set up Bluepill to monitor a Rails app running under Phusion Passenger? Any insight would be useful.

    Read the article
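
    A minimal sketch of the "watch the PIDs and kill the oversized ones" part described above -- not a Bluepill config, just the watchdog logic as a standalone script that could run from cron. The process-name pattern and the 300 MB ceiling are assumptions, not values from the post:

        #!/usr/bin/env python3
        # Hypothetical watchdog: terminate Passenger workers whose resident
        # memory exceeds a limit so Passenger can respawn them.
        import os
        import signal
        import subprocess

        LIMIT_KB = 300 * 1024      # assumed per-worker ceiling (300 MB)
        PATTERN = "Rails:"         # assumed: how the workers show up in ps output

        def oversized_workers():
            # one "<pid> <rss-in-KB> <command>" line per process
            out = subprocess.check_output(["ps", "-eo", "pid,rss,args"], text=True)
            for line in out.splitlines()[1:]:
                pid, rss, cmd = line.strip().split(None, 2)
                if PATTERN in cmd and int(rss) > LIMIT_KB:
                    yield int(pid), int(rss)

        if __name__ == "__main__":
            for pid, rss in oversized_workers():
                print("terminating", pid, "using", rss // 1024, "MB")
                os.kill(pid, signal.SIGTERM)   # fall back to SIGKILL only if this fails

    Bluepill's own memory checks could express the same rule, but a standalone check like this sidesteps the start/stop mismatch mentioned in the question.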

  • Home movie video browser

    - by Jim Hunziker
    I have a bunch of home movies that don't have useful filenames because they came straight off the camera. (I'm using Vista 64, by the way.) Picasa is pretty good for browsing through them and watching them, but it doesn't use my video card for rendering the videos. My CPU gets pegged at max usage, and full screen barely works. Windows Media Player or Quicktime works fine. Is there another application (like Picasa) that can be used for browsing through movies that both uses my video card and shows thumbnails of all the movies in my collection? I'd rather have something nicer than Windows Explorer. (The movies are h.264 AAC 1280x720.)

    Read the article

  • Implications of Multiple JobTracker nodes in a Hadoop cluster?

    - by Jim Dennis
    I get the impression that one can, potentially, have multiple JobTracker nodes configured to share the same set of MR (TaskTracker) nodes. I know that, conventionally, all the nodes in a Hadoop cluster should have the same set of configuration files (conventionally under /etc/hadoop/conf/, at least for the Cloudera Distribution of Hadoop (CDH)). Can we define multiple JobTrackers in mapred-site.xml? Something like:

        <configuration>
          <property>
            <name>mapred.job.tracker</name>
            <value>jt01.mydomain.not:8021</value>
          </property>
          <property>
            <name>mapred.job.tracker</name>
            <value>jt02.mydomain.not:8021</value>
          </property>
          ...
        </configuration>

    Or is there some other allowed syntax for this? What are the implications of doing this? Does each JobTracker get information about the load on each TaskTracker node? In other words, can the two JobTrackers coordinate their scheduling across the TT nodes based only on the gossip information from the TTs, or would they need to talk to one another? Is this documented anywhere? (A small illustration of how a repeated property name behaves is below.)

    Read the article
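
    Not an answer to the scheduling question above, just a small illustration of what generally happens to a repeated property name when a fragment like the one shown is reduced to a name-to-value map (an assumption about plain key/value config loading in general, not a statement about any particular Hadoop release):

        #!/usr/bin/env python3
        # Parse a mapred-site.xml-style fragment and collapse it to a dict;
        # the second mapred.job.tracker value silently replaces the first.
        import xml.etree.ElementTree as ET

        FRAGMENT = """
        <configuration>
          <property><name>mapred.job.tracker</name><value>jt01.mydomain.not:8021</value></property>
          <property><name>mapred.job.tracker</name><value>jt02.mydomain.not:8021</value></property>
        </configuration>
        """

        props = {}
        for prop in ET.fromstring(FRAGMENT).findall("property"):
            name, value = prop.findtext("name"), prop.findtext("value")
            if name in props:
                print("duplicate %s: %r overridden by %r" % (name, props[name], value))
            props[name] = value

        print(props)   # only one mapred.job.tracker entry survives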

  • Are all SATA cables compatible with SATA 3?

    - by Jim Fell
    I have an HP Compaq de5700 Small Form Factor desktop computer, and I am looking to upgrade its hard drive. When I open up the box, it clearly has available SATA connectors on the motherboard, but no indication as to which SATA version (1, 2, or 3). The hard drive I am considering is SATA 3. My concern is that if the motherboard also supports SATA 3 and I use an old SATA cable (v1 or v2), might there be problems? This is a bare drive, so I don't expect a cable to come with it, and I have not been able to find the manual for this machine. Thanks.

    Read the article

  • Random "not accessible" "you might not have permission to use this network resource"

    - by Jim Fred
    A couple of computers, both Win7-64, can connect to shares on a NAS server, at least most of the time. At random intervals, these Win7-64 computers cannot access some shares but can access others on the same NAS. When access is denied, a dialog box appears saying "\\myServer\MyShare02 not accessible...you might not have permission to use this network resource..." Other shares, say \\myServer\MyShare01, ARE accessible from the affected computers, and yet other computers CAN access the affected shares. Rebooting the affected computers seems to let them connect to the affected shares again - but then, getting a cup of coffee seems to help too. When the problem appears, the network seems to be OK: the affected computers can access other shares on the affected server and can ping, etc., and other computers can access the affected shares. The NAS server is a NetGear ReadyNAS Pro. The problem might be on the NAS side, such as a resource limitation, but since only the two Win7-64 PCs seem to be affected, the problem could be on the PC side - I'm not sure yet. I of course searched for solutions and found several tips addressing initial connection problems (use the correct workgroup name, use the IP address instead of the server name, remove security restrictions, etc.), but none of those remedies address the random nature of this problem.

    Read the article

  • Combine multiple rows into one

    - by Jim
    I am trying to combine multiple rows of data into one. Column A contains the value on which the groupings will be based -- rows whose Column A values match will be combined into one row. My range extends from column A through X, so I need the matching row of data to start in column Y. Example:

        ¦ 1001 ¦ A ¦ C ¦
        ¦ 1001 ¦ B ¦ D ¦
        ¦ 1002 ¦ A ¦ E ¦
        ¦ 1002 ¦ B ¦ F ¦
        ¦ 1002 ¦ C ¦ G ¦

    Desired result:

        ¦ 1001 ¦ A ¦ C ¦ B ¦ D ¦   ¦   ¦
        ¦ 1002 ¦ A ¦ E ¦ B ¦ F ¦ C ¦ G ¦

    The VBA code I am currently using is not taking the entire contents of the matched row; it is only taking the data in the 2nd column and moving it up (see the sketch below for the intended result). VBA code:

        Sub Mergeitems()
            Dim cl As Range
            Dim rw As Range
            Set rw = ActiveCell
            Do While rw <> ""                   ' for each row in data set
                ' find first empty cell on row
                Set cl = rw.Offset(0, 1)
                Do While cl <> ""
                    Set cl = cl.Offset(0, 1)
                Loop
                ' if next row needs to be processed...
                Do While rw = rw.Offset(1, 0)
                    cl = rw.Offset(1, 1)                         ' move the data
                    Set cl = cl.Offset(0, 1)                     ' update pointer to next blank cell
                    rw.Offset(1, 0).EntireRow.Delete xlShiftUp   ' delete old data
                Loop
                ' next row
                Set rw = rw.Offset(1, 0)
            Loop
        End Sub

    Read the article
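
    For what it's worth, here is the same transformation sketched outside Excel with pandas (the column names are made up); it is only meant to pin down the desired result, not to replace the VBA:

        #!/usr/bin/env python3
        # Group on the first column and chain the remaining cells of each
        # group's rows into one wide row, as in the "Desired result" above.
        import pandas as pd

        df = pd.DataFrame(
            [[1001, "A", "C"], [1001, "B", "D"],
             [1002, "A", "E"], [1002, "B", "F"], [1002, "C", "G"]],
            columns=["key", "col1", "col2"],          # hypothetical headers
        )

        def flatten(group):
            cells = group[["col1", "col2"]].to_numpy().ravel().tolist()
            return pd.Series(cells)

        wide = df.groupby("key").apply(flatten)
        print(wide)   # 1001 -> A C B D, 1002 -> A E B F C G (NaN pads the shorter row)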

  • Rails /tmp/cache/assets permissions issue using Debian virtual machine hosted on OS X Lion

    - by Jim
    I am running Parallels Desktop 7 on OS X Lion. I have a VM with Debian installed, and inside that VM I set up a Rails development environment. I am using Parallels Tools to share out my OS X home directory to the VM - the goal is to run the Rails server on the VM but host the files on OS X (so they are automatically backed up, and so I can use tools like TextMate to develop with). Everything seems to work with the shared directory - my Debian user can read, write, and execute files. However, when I cloned a recent Rails project from Git, I got an error message when it tried to compile the CSS assets. My symptoms are exactly the same as in this question: http://stackoverflow.com/questions/7556774/rails-sprocket-error-compiling-css-assest-chown-issue I believe this is permissions-based, but it is really weird. My entire Rails project directory has permissions set to 777 and my Debian user owns it. If I navigate into /tmp/cache/assets, those permissions are the same. However, the three-character directories Rails is creating (DCE, DA1, D05, etc...) are being created without write permissions! If I refresh the Rails page a few times (about 4 or 5, with Rails creating new three-character directories every time), eventually it will create one of the directories with the proper 777 permissions and everything will work! This persists until I make a change to the CSS files and it has to recompile. Does anyone have any idea what might be going on here? I can't fathom why it is creating temp directories with incorrect permissions, or why after a few refreshes the good permissions kick in and it works. It definitely seems to be an issue with the share, since if I move the project into a different directory on the VM, it works fine. On the OS X side, I've given the shared folder 777 permissions as well, but no dice... any ideas? (A small permissions probe is sketched below.) Update: I've found that the number of times I need to refresh before it works is not random - it has to do with how many assets are being compiled. For example, if I edit one of my CSS files, and there are four CSS files in the app/assets/stylesheets directory, I have to refresh four times before the app will finally work without the "operation not permitted" error.

    Read the article
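
    A tiny probe for the symptom above, under the assumption that the share (or the umask in the server's environment) is what strips the write bits: it creates a directory on the shared mount and prints the mode it actually ends up with. The mount path is a placeholder:

        #!/usr/bin/env python3
        # Create a test directory on the Parallels-shared path and report the
        # permissions it gets, alongside the current process umask.
        import os
        import stat

        MOUNT = "/media/psf/Home/myapp/tmp/cache"   # hypothetical shared-folder path

        umask = os.umask(0)
        os.umask(umask)                             # read the umask without changing it
        print("umask:", oct(umask))

        probe = os.path.join(MOUNT, "permtest")
        os.mkdir(probe)                             # requested mode 0o777, filtered by umask/mount options
        print(probe, "->", oct(stat.S_IMODE(os.stat(probe).st_mode)))
        os.rmdir(probe)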

  • Can't run utilities/.exe's that use the network from a [DFS] windows share on Windows 2008 servers. Can this be overcome?

    - by Jim Lawhon
    Under Windows Server 2008 I'm unable to run many utilities that use network resources. This works just fine under Windows Server 2003. For example:

        \\domain\dfs\tools$\bin\sendmail.exe ...
        \\domain\dfs\tools$\bin\psexec.exe ...
        echo %_metric% %_value% %_unixtime% | \\domain\dfs\bin\foo$\nc graphite.domain 2003 -w1

    Reproducing and maintaining this folder on a large number of servers/VMs is not desirable. Is there a way to allow Windows Server 2008 to run these tools? If so, can this be enabled via GPO or in a fashion that can be scripted during automated builds?
    Edit: The commands/tools do work just fine when run from local drives.
    Edit2: Wget example:

        d:\scripts\helpers>z:\bin\wget http://www.google.com
        SYSTEM_WGETRC = c:/progra~1/wget/etc/wgetrc
        syswgetrc = z:/etc/wgetrc
        --2011-04-11 00:32:15--  http://www.google.com/
        Resolving www.google.com... failed: Host not found.
        z:\bin\wget: unable to resolve host address `www.google.com'

    wget can neither use DNS to resolve the IP nor use HTTP if provided an IP directly.
    Edit3: The problem seems to be tied to DFS/DFS shares. Tools run correctly from other normal Windows Server file shares. They also run correctly when run directly from the file servers behind the DFS. They only fail when we attempt to run them from the DFS UNC path or mapped drives.

    Read the article

  • Start Chrome from the command line with arguments to make it log into your Google account automatically

    - by jim
    Is there a way to start Chrome from the command line (on Linux), providing arguments that make it log into a Google account automatically? I'm looking for something like google-chrome -account foo -pass bar that I can easily put in a bash script later (see the launcher sketch below). A little background: I have a laptop connected to my TV, which currently uses just a mouse for user interaction. No Google account is logged in by default, and that's the way I want to keep it, so my kids can't come across videos and pictures on Google and YouTube that they are not supposed to see (e.g. adult content, or anything marked as not appropriate for kids by Google's safe-search filters). The bad thing about this is that some music videos on YouTube require you to be logged in to watch, usually the ones we (the adults) sing when playing karaoke... As the only input available is a mouse, I'm looking for a way to start with my Google account without having to type the whole thing using the on-screen keyboard. You may think, "Why can't you use the keyboard if the laptop is right there?" Well, it's in an uncomfortable position - too high for me without a chair or something, as it sits on top of the furniture where the TV is located. Is there a way to make this scriptable? If not, do you know any other workaround? Note: "remember me after logging off" and similar options are out, as the safe-search Chrome profile must always be the default one to run.

    Read the article
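
    As far as I know Chrome has no documented --account/--password style switches, so a common workaround is a second profile directory that was signed in once by hand and is then reused. A launcher along those lines (the profile path and URL are placeholders) might look like:

        #!/usr/bin/env python3
        # Launch Chrome with a dedicated, pre-authenticated profile directory.
        import os
        import subprocess

        PROFILE = os.path.expanduser("~/.config/chrome-karaoke")   # hypothetical profile path

        subprocess.Popen([
            "google-chrome",                # or "chromium-browser", depending on the distro
            "--user-data-dir=" + PROFILE,   # keeps this account separate from the default (kid-safe) profile
            "https://www.youtube.com/",
        ])

    The same single command drops straight into a bash script or a desktop launcher, and the default profile stays signed out.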

  • Migrating from one linux install to another: How to keep the second disk around?

    - by Jim Miller
    I've got a Linux box running Fedora 19 that I want to move to CentOS 6.4. Rather than trying to do something fancy with the current disk (which has also accumulated a lot of sludge over the years), I'm going to get a new disk, put CentOS on that, and then move the bits worth preserving from the old disk to the new one. I haven't done this yet, but I presume it should be semi-straightforward -- do the CentOS install on the new disk, mount the old disk on /olddisk or somesuch, and start copying. However, I'm not sure how to get the machine to recognize the new empty disk as the target of the CentOS install (I suppose I can just pull the old disk during the installation), make it remember that this is the intended boot disk once the install has happened, and tweak /etc/fstab (right?) to set up the old disk on the desired mount point. (Both disks are, or will be, SATA.) I could probably hack it together without losing too much hair or doing too much damage, but could anyone offer some advice that would get/keep me on the right track? Thanks!

    Read the article

  • Standalone tool to synchronize two folders in windows?

    - by Jim
    I keep data on a USB drive, but I also keep a copy of all of that data on a hard disk. I like using the hard disk because it's faster and gives me a backup. What standalone tools would work to keep the files on the disk and USB drive in sync? I'd like a single command-line executable or standalone GUI app that can do the job -- something I could run off the USB drive (even something as simple as the sketch below would do), so things like the MS Sync Tool wouldn't work.

    Read the article
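
    If no off-the-shelf tool fits, the core of a one-way mirror is small enough to sketch: copy anything that is missing or newer from the USB drive to the disk. The paths are placeholders, nothing is ever deleted, and the 2-second slack allows for FAT timestamp granularity on the USB drive:

        #!/usr/bin/env python3
        # One-way sync sketch: copy files from SRC to DST when they are
        # missing there or newer on SRC.
        import os
        import shutil

        SRC = r"E:\data"            # hypothetical USB-drive folder
        DST = r"C:\backup\data"     # hypothetical hard-disk copy

        for root, _dirs, files in os.walk(SRC):
            target_dir = os.path.join(DST, os.path.relpath(root, SRC))
            os.makedirs(target_dir, exist_ok=True)
            for name in files:
                src_file = os.path.join(root, name)
                dst_file = os.path.join(target_dir, name)
                if (not os.path.exists(dst_file)
                        or os.path.getmtime(src_file) > os.path.getmtime(dst_file) + 2):
                    shutil.copy2(src_file, dst_file)
                    print("copied", dst_file)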

  • grep + sed for find & replace fun!

    - by Jim Greenleaf
    I have a dev copy of a website set up that has quite a few hardcoded references to its live counterpart. I would like to replace all occurrences of "www." with "dev." in all files. I think I can use a combination of grep + sed, but I'm not sure how (a scripted equivalent is sketched below).

    Read the article
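
    The usual pairing is grep -l to list the files that contain the string and sed -i to rewrite them in place. As a fallback in case an in-place sed is not available, here is the same replacement as a short script; the web-root path is a placeholder, and it edits files in place, so point it at the dev copy only:

        #!/usr/bin/env python3
        # Walk the dev tree and rewrite "www." to "dev." in every text file.
        import os

        WEBROOT = "/var/www/dev-site"    # hypothetical path to the dev copy

        for root, _dirs, files in os.walk(WEBROOT):
            for name in files:
                path = os.path.join(root, name)
                try:
                    # newline="" preserves the original line endings
                    with open(path, encoding="utf-8", newline="") as fh:
                        text = fh.read()
                except (UnicodeDecodeError, OSError):
                    continue             # skip binary or unreadable files
                if "www." in text:
                    with open(path, "w", encoding="utf-8", newline="") as fh:
                        fh.write(text.replace("www.", "dev."))
                    print("updated", path)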

  • IIS 6 Denies access to the default document

    - by Jim
    I've got Windows Server 2k3 with IIS 6 hosting a couple of ASP.NET MVC 2 applications (.NET 4), all in the Default Web Site. Most of them simply use Integrated authentication, but a couple use Forms as well. All the applications work properly and are correctly accessible. The problem I'm trying to resolve is access to the default document, which is currently specified as index.htm. Both index.htm and the Default Web Site are configured to allow anonymous access (with none of the authenticated-access boxes checked). However, access to the file is denied: requests to server.domain.tld/ and server.domain.tld/index.htm both yield 401 errors, while server.domain.tld/default.htm (a file that does not exist) properly returns a 404. If I alter the file security on index.htm to allow Integrated authentication, then requesting /index.htm directly works properly for users with domain accounts, but anonymous users get a login prompt/401. How can I configure IIS to allow all users to view index.htm via server.domain.tld/?

    Read the article

  • Dell XPS M1330 - power cable pulled by accident and now it won't turn on

    - by jim
    I have a problem similar to others posted on this site: when I plug in my laptop adapter, the green light comes on as expected, but when I plug it into the laptop, the light goes out. In my case, I know it's not the adapter, because I have two and they both show the same issue. I'm quite certain the problem is a short in the laptop. I was using the laptop today when the power cord was pulled out by accident, and now I'm in this predicament. What do I check on the laptop to isolate the problem? Thanks!

    Read the article

  • Color highlights have vanished in emacs

    - by Jim Kiley
    I'm using Emacs on a remote Linux server that I access via SSH. I'm editing C files that have a non-standard suffix, so I have had to enter c-mode manually with M-x c-mode every time I open one of those files. I found this annoying, so I started monkeying with my .emacs to make the problem go away. Instead, that made all the color highlights go away - not just in c-mode; all my color highlighting is gone. I've removed the .emacs file and logged out and back in, but the colors are still missing. I miss them! They were very helpful. How do I get them back?

    Read the article

  • RPM removal does not remove delivered dirs and leaves garbage

    - by Jim
    I deliver an application via an RPM. The application delivers various directories and files; e.g., a file structure is copied under /opt/internal/com. I was expecting that on rpm -e the whole file structure delivered under /opt/internal/com would be removed, but it is not. Some directories in the file structure are non-empty - is that the reason? Those (non-empty) directories were created by the RPM installation, so I would expect them to be "owned" by the RPM and removed automatically. Is this wrong? Am I supposed to remove them manually? (A sketch for finding the leftovers is below.)

    Read the article
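
    rpm -e generally removes only the paths the package owns, and it will not delete a directory that still contains files it does not own (logs, caches, files created after installation). A rough way to see exactly what was left behind -- the package name and manifest path are placeholders:

        #!/usr/bin/env python3
        # Run once *before* `rpm -e` to save the package's file list, then
        # again *after* erasing to report anything still on disk.
        import os
        import subprocess

        PACKAGE = "myapp"                      # hypothetical package name
        MANIFEST = "/var/tmp/myapp.filelist"   # where the pre-erase file list is kept

        if not os.path.exists(MANIFEST):
            # before erase: record every path the package claims to own
            files = subprocess.check_output(["rpm", "-ql", PACKAGE], text=True)
            with open(MANIFEST, "w") as fh:
                fh.write(files)
            print("saved file list to", MANIFEST)
        else:
            # after erase: anything still present needs manual cleanup
            with open(MANIFEST) as fh:
                for path in fh.read().splitlines():
                    if os.path.exists(path):
                        print("still present:", path)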

  • Minimal backup for Windows 7 system recovery

    - by Jim
    There might not be an answer to this, but for a home Win7 system, what files/directories must be backed up to recover after a Windows crash? I can reinstall software, and I keep data files elsewhere. When I use Acronis home backup software to back up my "critical" files, it seems to choose the entire partition, and the incremental updates are mostly browser cache files and the like. Or, after a crash, should I just reinstall Windows? I dread the hours of Windows updates that would require. Thanks.

    Read the article

  • Delete a windows group in Active Directory

    - by Jim
    I am doing a cleanup of some AD groups that are no longer used. One of the groups I could not delete because it seems a member has it set as their primary group (which I assume someone did by accident). Is there an easy way to find out who has this group set as primary? (A query sketch is below.)

    Read the article
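
    One way to find the member, sketched with the ldap3 module, under the assumption that you already know the group's RID (the last block of its objectSid): users record that RID in their primaryGroupID attribute, so filtering on it lists everyone whose primary group it is. The server name, credentials, base DN, and RID below are all placeholders:

        #!/usr/bin/env python3
        # List users whose primaryGroupID points at the group's RID.
        from ldap3 import ALL, Connection, Server, SUBTREE

        GROUP_RID = 1234                      # placeholder: last component of the group's SID

        server = Server("dc01.example.local", get_info=ALL)
        conn = Connection(server, user="EXAMPLE\\admin", password="secret", auto_bind=True)
        conn.search(
            search_base="DC=example,DC=local",
            search_filter="(&(objectCategory=person)(primaryGroupID=%d))" % GROUP_RID,
            search_scope=SUBTREE,
            attributes=["sAMAccountName"],
        )
        for entry in conn.entries:
            print(entry.sAMAccountName)

    The same LDAP filter should also work from dsquery * or PowerShell's Get-ADUser -LDAPFilter.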

  • What music streaming app fits my needs on Ubuntu?

    - by Jim
    I'm looking for an application to stream Internet radio on Ubuntu. I like listening to Radio Paradise while I work. Right now, I'm using Amarok. "Movie Player" sometimes refuses to open the stream, and VLC doesn't keep its window title updated with the currently playing track. Amarok has nice translucent notifications when tracks change, but track changes in streams don't trigger the notifications. Mostly, I want something that reliably opens streams and makes it easy to see the name of the track that's playing. If it has a built-in directory of streaming radio stations, that would be a big benefit.

    Read the article

  • Parallels: How to see a Mac-hosted website from Windows?

    - by Jim Miller
    I'm traveling at the moment and have moved one of the websites I'm working on to my MBP so I can work on it without a network connection. I've added an entry to the Mac's /etc/hosts file pointing the domain name to 127.0.0.1, and all's well. I now want to go into Parallels and check the site from Windows browsers. How do I set things up so that the Windows browser will resolve the domain name and reach the site? The Windows image obviously doesn't see the Mac's /etc/hosts file, and a 127.0.0.1 entry in the Windows hosts file just as obviously points to Windows, not the Mac. Any advice out there? Thanks!

    Read the article

  • Finding throughput of CPU and hard drive on Solaris

    - by Jim
    How do I find the throughput of the CPU and the hard disk on an OpenSolaris machine - using mpstat or iostat? I'm having a hard time identifying the throughput, if it is given at all, in the commands' output. For example, in mpstat there is very little explanation as to what the columns mean. I've been using the syscl column divided by the time interval to find the throughput, but to be honest I have no idea what a system call truly is. I'm trying to analyze a hard drive and CPU while writing a file to the hard disk and when at rest (a crude write-throughput check is sketched below).

    Read the article
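
    For the hard-drive half, iostat's extended device statistics (iostat -x) report per-device read/write KB per second, which is the disk throughput figure. A crude cross-check against those numbers while they scroll by -- the path is a placeholder and should sit on the disk under test (not /tmp, which is swap-backed tmpfs on Solaris), and the fsync keeps the page cache from flattering the result:

        #!/usr/bin/env python3
        # Write a test file and report MiB/s, to compare with iostat's view.
        import os
        import time

        PATH = "/export/home/jim/throughput.bin"   # hypothetical path on the disk being measured
        CHUNK = b"\0" * (1024 * 1024)              # 1 MiB
        TOTAL_MB = 512

        start = time.time()
        with open(PATH, "wb") as fh:
            for _ in range(TOTAL_MB):
                fh.write(CHUNK)
            fh.flush()
            os.fsync(fh.fileno())                  # force the data out of the page cache
        elapsed = time.time() - start
        print("wrote %d MiB in %.1fs -> %.1f MiB/s" % (TOTAL_MB, elapsed, TOTAL_MB / elapsed))
        os.remove(PATH)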

  • In-place SQL 2008 upgrade vs. side-by-side?

    - by Jim
    I have a SQL 2005 Standard Edition server with 5 databases in production; 4 of the databases are used by web-based apps and the 5th by a desktop application. My question is: should I perform an in-place upgrade, or go side-by-side by creating a SQL 2008 instance on the same box? The machine is a VM on VMware, and I'm planning to take a snapshot before the upgrade and have a 'blackout' window during it so that I can roll back to the snapshot if things go really badly. Any previous experience and advice is appreciated.

    Read the article
