Search Results

Search found 7175 results on 287 pages for 'job hunting'.


  • Do you work in your server room?

    - by Gary Richardson
    I once had a job offer from a company that wanted my workstation to be in the AC-controlled, noisy server room with no natural light. I'm not sure what their motivation was. Possibly it made sense to them for me to be close to the servers, or possibly they wanted to save the desk space for other employees. I turned down the job (for many reasons, including the working environment). Is this a common practice? Do you work in your LAN room? How do you cope?


  • Software for printing small photos on larger paper with easy layout snapping

    - by ldigas
    I have some photos and pictures to print (graphs, but that doesn't matter here). So far when printing I've inserted them into MS Word, played with the layout on the paper, and after half a day finally managed to get it just right. Usually then the power runs out :-) Anyway, does anyone know of any software that lets me easily set up some kind of layout (one photo under the other, or a rectangular grid) where I can just "snap" photos next to each other before printing the whole thing? I know I can do that in pretty much any photo-editing program, but the problem is that all that pixel hunting takes time and is generally a very annoying process, especially when you don't want to edit the photos at all, just print them out. I'm sure you get the general idea. Suggestions?


  • Migrating to LDAP

    - by Frank Brenner
    Hi Folks, I've started a new job at a house where they've got an amazingly unruly patchwork of Linux, xBSD, and OpenSolaris boxes. Every box does its own user auth against its local /etc/passwd, etc. Users and groups have differing uids/gids on each machine, and each machine has its own /home tree (no central NAS for homes). My job is to get everything into an LDAP directory and use that for login auth. How do I get LDAP to deal with the differing uids/gids? Thanks.
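
    A minimal sketch of what a consolidated account could look like in the directory; the DN, numbers, and names below are hypothetical, since the real migration work is choosing one canonical uidNumber/gidNumber per user:

        dn: uid=jdoe,ou=People,dc=example,dc=com
        objectClass: account
        objectClass: posixAccount
        uid: jdoe
        cn: jdoe
        uidNumber: 10001
        gidNumber: 10001
        homeDirectory: /home/jdoe
        loginShell: /bin/bash

    After picking the canonical ids, each machine's files still carry the old ones, so something like "find / -xdev -uid 501 -exec chown -h 10001 {} +" (the old uid 501 is a placeholder) has to run per host before switching that host over to LDAP auth.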


  • Animated HTTP request visualisation on Apache

    - by Simon Bennett
    This is more a question to appease my memory, in trying to remember something I saw a while ago. I remember being introduced to a realtime server visualisation tool that showed the current requests Apache was handling in a kind of fireworks effect on screen: each request/group of requests would be shot across the screen in varying colours. I can't for the life of me remember what it was called, and hunting around here and on Google has left me empty-handed. Just wondering if anybody else is able to pluck this gem from memory and ease my pain! Thanks


  • Why does a Remote Desktop Redirected Printer Doc appear every time I connect to Windows Server 2003 SBS?

    - by Jim Dagg
    I've run into a weird, persistent issue regarding a remote desktop connection. Every time I successfully log into a server running Windows Server 2003 SBS, without taking any further action, after a few seconds a print job spontaneously appears on my machine, titled "Remote Desktop Redirected Printer Doc". The document is 4K, datatype RAW, processor "WinPrint". I've heard of people running into this issue before, but can't seem to hunt down a coherent solution. It's a minor annoyance, but I get annoyed when Windows complains about a print job that, as far as I know, came from nowhere. Any thoughts on why this would occur and how I could prevent it from happening?
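
    If it turns out to be the printer redirection itself, the client side can turn that off either in mstsc's Local Resources tab or with this line in the saved .rdp file (a hedged pointer, not a confirmed fix for the phantom job):

        redirectprinters:i:0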


  • can't send with postfix but I can with one user

    - by CvR_XX
    I have a Postfix and Dovecot server, but when I try to send an email I get a timeout. I'm trying to send as [email protected]. A telnet session isn't helping much either; I just get a blank screen. Locally it works fine. My SMTP service is running on treadity.com:25. The strange thing is that the logs are completely empty of any info about sending mail; receiving works fine. Another strange thing is that I have sent some messages successfully, but only from one account. I can still send from that account, while other accounts fail. Any ideas? Config file:

        # See /usr/share/postfix/main.cf.dist for a commented, more complete version

        # Debian specific: Specifying a file name will cause the first
        # line of that file to be used as the name. The Debian default
        # is /etc/mailname.
        #myorigin = /etc/mailname

        smtpd_banner = $myhostname ESMTP $mail_name (Debian/GNU)
        biff = no

        # appending .domain is the MUA's job.
        append_dot_mydomain = no

        # Uncomment the next line to generate "delayed mail" warnings
        #delay_warning_time = 4h

        readme_directory = no

        # TLS parameters
        #smtpd_tls_cert_file=/etc/ssl/certs/ssl-cert-snakeoil.pem
        #smtpd_tls_key_file=/etc/ssl/private/ssl-cert-snakeoil.key
        #smtpd_use_tls=yes
        #smtpd_tls_session_cache_database = btree:${data_directory}/smtpd_scache
        #smtp_tls_session_cache_database = btree:${data_directory}/smtp_scache
        smtpd_tls_cert_file=/etc/ssl/certs/dovecot.pem
        smtpd_tls_key_file=/etc/ssl/private/dovecot.pem
        smtpd_use_tls=yes
        smtpd_tls_auth_only = yes

        # Enabling SMTP for authenticated users, and handing off authentication to Dovecot
        smtpd_sasl_type = dovecot
        smtpd_sasl_path = private/auth
        smtpd_sasl_auth_enable = yes
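
    A blank telnet screen plus empty logs usually means the connection never reaches this smtpd at all, so a probe from outside the LAN is worth trying; a sketch, assuming the host/port quoted above (note that with smtpd_tls_auth_only = yes the server won't offer AUTH until after STARTTLS, which plain telnet cannot negotiate):

        telnet treadity.com 25
        openssl s_client -starttls smtp -connect treadity.com:25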


  • Ethernet port sleeping on PS3 running Linux

    - by Doug
    My lab has a PS3 running Ubuntu Linux 9.04 Server Edition. After a period of a few hours with no use, the Ethernet connection (eth0) seems to go to sleep, causing the connection to be lost. Pinging or trying to SSH into the machine results in no response. The fix I've been using is to access the machine locally and restart it (trying to bring eth0 down then up doesn't seem to correct it). I've tried setting up an hourly cron job that runs on the PS3 and pings another machine just to create network activity, but this doesn't seem to solve the problem either. Update: The solution was to run the above cron job much more frequently: every 10 minutes works.
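
    For reference, a keepalive entry like the one described might look like this in the PS3's crontab (the gateway address is a placeholder):

        # ping the gateway every 10 minutes to keep eth0 from idling out
        */10 * * * * ping -c 1 192.168.1.1 > /dev/null 2>&1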


  • SharePoint 3.0 site restore/import trouble

    - by Trondh
    Hi, we have some old SharePoint data (from a WSS 3.0 SP1 or SP2 install) that I need to restore. Problem is, this is a time-management site, and one of the fields automatically picks up the user name of the user who enters data; this is used to keep track of who worked when. Now, when I import this into my temporary SharePoint 3.0 server, these fields are blanked, and the creator of the element is replaced by my admin user (the account that ran the import job). So, to the question: is there any way at all to grab hold of these data before the SharePoint import job "destroys" them? I'm using stsadm -o import. I don't care if I have to pick the database itself apart manually; I just need to know if it's possible to get hold of these fields with data intact from my export files. (Backup, you say? It was deleted long ago. This SharePoint export is all we have...)
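
    One thing worth checking before picking the export apart by hand: stsadm import has a flag that preserves authorship and security metadata instead of stamping everything with the importing account. A sketch (URL and filename are placeholders), assuming the export was made with the matching option:

        stsadm -o import -url http://tempserver/sites/time -filename timesite.cmp -includeusersecurity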


  • Dell 2155cdn Multifunction Printer

    - by Wayne-Z
    Hey guys. After a prolonged period (overnight), I lose the ability to initiate a scan directly from the unit. I'll initiate a job and after a while it times out and beeps. However, when I go to the PC and initiate a scan from there, the scan job goes through just fine, and I can then initiate scans directly from the unit all day, until another prolonged period (overnight). Then I have to do the process again. I've made sure all power-save settings are off; the PC doesn't go to sleep or into a power-save state. Any insight into this will be appreciated. Pertinent info:
    - Unit: Dell 2155cdn Multifunction
    - Connected via USB
    - PC: Dell Optiplex 755


  • Multiple .bkf files created in Backup Exec 12.5 or 2010 related to heavy I/O?

    - by syuusuke
    Hey everyone, I was wondering if anyone who has used Backup Exec 12.5 or 2010 has ever seen multiple .bkf files created for a single job. To describe what I mean by multiple files: the .bkf files are being created at random sizes under 2GB, even though I've set the option to split the file after it reaches 10GB. Some jobs will create 20 .bkf files in one job, with chunks ranging from 50MB to 800MB. Is this a sign of heavy I/O issues? Bandwidth limitations? I'm not sure; I'm here to seek some advice and suggestions. I've set up another backup server with the exact same settings, and it seems to create a new .bkf file only when the 10GB limit has been reached. I am backing up different machines, but I know my settings are an exact match to the problematic server's, or at least I think that's where the problem is.


  • Looking for a Linux stream ripper that can be scheduled

    - by Anthony D
    I have an MP3 stream I want to schedule a recording of. I can do it using wget to a file; it's just a straight MP3 stream. However, I'd like to use a command-line stream ripper that will do a better job. Anyone know of one? Update 1: wget grabs whatever part of the stream it comes in on, which may not be the start of a frame in the MP3 file. Also, wget is not really schedule-ready. I experimented with starting it from a cron job and then killing it later; this produced a file that didn't really start and stop where I wanted.
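
    One tool often suggested for this is streamripper; a hedged sketch of a scheduled one-hour recording via cron (URL, times, and paths are placeholders; check man streamripper for the exact flags on your build):

        # record for 3600 seconds starting at 22:00, ripping into a single file
        0 22 * * * streamripper http://example.com/stream.mp3 -l 3600 -a show.mp3 -d /home/anthony/rips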


  • Config deployment on multiple servers.

    - by user66601
    I have multiple servers in a web cluster (identical configuration on all of them, apart from the IP). How do you deploy config changes to multiple servers? My current process: I make the new config, then create a config for every server (placing the correct IP), and upload them to every server, replacing the old ones (rsync over SSH). Then I set a job on every server that reloads the webserver at the same time (the servers use NTP). This is done by issuing commands from a script, to save the time spent logging in; before the reload job is added, there's a checksum test of the config on the server, and a notification in case of failure. What do you think of this method? What would be the "professional" way? :) (I don't say my way doesn't work; it works, and saves me the time of logging in to every webserver.) Regards,
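
    A minimal sketch of the push-and-verify loop described above (hostnames, paths, the nginx reload command, and the 03:00 time are all assumptions):

        #!/bin/sh
        # push one prepared config per host, verify it, then schedule a synchronized reload
        for host in web1 web2 web3; do
            rsync -e ssh "configs/$host/nginx.conf" "$host:/etc/nginx/nginx.conf"
            ssh "$host" md5sum /etc/nginx/nginx.conf         # compare against the local checksum
            ssh "$host" "echo 'service nginx reload' | at 03:00"   # same reload time everywhere (NTP-synced)
        done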


  • How can I stop ntbackup requiring my new password every time I'm forced to change my Windows password?

    - by Lunatik
    I have a scheduled job that runs each night using ntbackup, copying a folder on my HDD to a network share. The problem is that every time I'm required to change my Windows password, I have to remember to change it in ntbackup as well; otherwise the backup fails silently, i.e. I get no warning that the backup isn't being done. Is there a way to schedule this job so it will automatically pick up my new Windows password, or somehow not be tied to my main login? My user account type is Debugger, not full Administrator, so I'm not sure if that would restrict me in any way, e.g. still forcing a four-weekly password change on a dedicated user account for this. The PC runs XP SP2 on a Windows Server 2003 R2 domain.
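
    One common workaround is to run the task under a dedicated account whose password is set not to expire, instead of the interactive login; updating a task's stored credentials from the command line might look like this (task name and account are placeholders):

        schtasks /change /tn "Nightly ntbackup" /ru MYDOMAIN\backupsvc /rp N3wPassw0rd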


  • How to properly shrink the disk size of a server that is being backed up off-site?

    - by JKM
    We have a virtual machine (let's call this one source) that is hosted locally with a 1TB disk (that's how big the virtual disk is) and has been replicated to an off-site server via Veeam (let's call that one clone). However, some server configuration changes mean that source no longer requires as much disk space. I am contemplating shrinking the disk of source, or using the standalone converter to create a new image with a much smaller disk (about 300GB). The reason is to reduce the time required for the "Discovering replica VM" step during the replication process. My question is: what happens to clone when the replication job is run? Do I need to redo the replication or set up a new backup to create an initial seed for source? Will the job automatically pick up that the disk has shrunk and adjust the disk size of clone appropriately? What is the best method for accomplishing this?


  • How to repair a Veritas tape that has been overwritten a bit?

    - by Ismo Utriainen
    I meant to restore some files, but I forgot that there was a Monday backup job just waiting for a tape to be loaded. So Veritas 10d started writing over my tape, and that valuable data is now gone. The original data was about 40 GB, and the accidentally started job wrote about 30 MB to the beginning of the tape. What are my chances of recovering some data from that tape? Update: Inventory and catalog don't help; the media settings are overwrite, not append. It is a DLT drive.


  • How to automatically print the contents of a folder in OS X?

    - by MDRoz
    I would like to be able to set up a folder on my Mac, where I could dump files and have them be automatically sent to the default printer. This way I would be able to print files at home when I'm not physically at home, using something like Dropbox. It wouldn't have to be real-time; i.e., a scheduled job that checks the folder every so often would be acceptable. What's the easiest approach? Automator? Applescript? Cron job?
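
    A minimal cron-based sketch, assuming a Dropbox folder and lpr to the default printer; paths are placeholders, and printed files are moved aside so they only print once:

        #!/bin/sh
        # print-drop.sh: print everything dropped into the folder, then archive it
        DROP="$HOME/Dropbox/PrintMe"
        DONE="$DROP/printed"
        mkdir -p "$DONE"
        for f in "$DROP"/*; do
            [ -f "$f" ] || continue       # skip the archive subfolder
            lpr "$f" && mv "$f" "$DONE/"
        done

    Scheduled with a crontab line such as "*/5 * * * * $HOME/bin/print-drop.sh". A launchd job with WatchPaths would be the more Mac-native way to trigger on folder changes instead of polling.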


  • How do I allow a (local) user to start/stop services with a scheduled task?

    - by Mulmoth
    Hi, on a Windows 2008 R2 server I have two small .cmd scripts to start/stop a certain service. They look like this:

        net start MyService

    and

        net stop MyService

    I want to execute these scripts via a scheduled task, and I thought it would be best to create a local user for this job. The user is not a member of the Administrators group, and the scripts fail with exit code 2. When I log on as this local user and try to execute the scripts on the command line, I see a message like (perhaps not exactly translated from German to English): "Error code 5: Access denied". It doesn't matter whether I start the command line as Administrator or not. How can this local user gain the rights to do the job?
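
    An alternative to task-scheduler tricks is granting the account start/stop rights on the service itself via its security descriptor. A hedged sketch using sc (the SID is a placeholder, and sdset replaces the entire descriptor, so append the new ACE to exactly what sdshow printed rather than copying this string):

        sc sdshow MyService
        rem in service SDDL, RP = start, WP = stop, LC = query status
        sc sdset MyService "D:(A;;RPWPLC;;;S-1-5-21-XXXXXXXXXX-1001)(A;;CCLCSWRPWPDTLOCRRC;;;SY)(A;;CCDCLCSWRPWPDTLOCRSDRCWDWO;;;BA)(A;;CCLCSWLOCRRC;;;IU)"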


  • Remote Desktop Solution without VPN, with locked PC

    - by ujjain
    Sometimes I work from home, using either of these two methods: Teamviewer, or VPN + Remote Desktop. When I connect with the VPN, however, I am unable to browse any websites on my own computer, which can be very inconvenient when I am basically waiting an hour for somebody else to finish his job so that I can do a five-minute job; it would be nice if I could continue browsing. One solution for this is Teamviewer: with Teamviewer I can manage the other computer without suffering the restrictions of the VPN. However, everybody in the working area can see me using my computer remotely, and this is not a good situation either, especially during work hours. I would like a solution that allows me to continue browsing normally and still control my remote workstation, without other people seeing every move I make on the workstation.
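
    If the VPN client allows it, unticking "Use default gateway on remote network" (split tunneling) keeps ordinary browsing on the local connection while only work traffic crosses the VPN; the manual equivalent is a route for just the office subnet (the subnet and VPN gateway below are placeholders):

        route add 10.10.0.0 mask 255.255.0.0 10.8.0.1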


  • How to work around Windows error 0x80070057?

    - by Chris
    So today I was trying to copy a simple .PDF file from a local folder on my machine to a network folder, and every time I tried to move the file I got an error dialog stating that a wrong parameter was being passed, with the error code 0x80070057. Does anyone know of a way to work around this error so that I can get this file copied? I have tried renaming the file with an underscore in front. I have tried copying from my local folder to my desktop and then to the remote folder. I have tried hunting down anything that might be using the file. An example of the file name is: Flowers & Trees.pdf
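
    One more thing worth trying: a command-line copy sometimes succeeds where Explorer's shell copy fails with 0x80070057. A sketch with placeholder paths, assuming robocopy (or xcopy) is available:

        robocopy "C:\Docs" "\\server\share\Docs" "Flowers & Trees.pdf"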


  • rsync (or robocopy) from multiple computers - what happens?

    - by TheCleaner
    If a Linux server has two different nightly rsync jobs for the same folder to two different destinations, do both destinations end up with the same final set of files? Or does the first job run and set something on the source folder/files that would cause the second rsync job not to notice the daily changes/updates to the source? Same question for a Windows environment using something like robocopy, or even a "differential" backup using BUE or similar: does each sync compare the destination to the source and update the destination regardless of whether the source has already been synced to other destinations?
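
    For what it's worth, a plain rsync run keeps no state on the source side; each invocation compares the source against that particular destination and writes nothing back, so two nightly jobs like these (hosts and paths assumed) stay independent:

        rsync -a /srv/data/ backup1.example.com:/srv/data/
        rsync -a /srv/data/ backup2.example.com:/srv/data/

    The Windows caveat is different: backup modes that clear the archive bit (incrementals, in most products) do modify source state, and that is the one case where one job can affect what the next one sees.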


  • How to get cells to default to zero or calculate additional fees, based on selection from a drop-down list

    - by User300479
    I am building a pay-rate calculator worksheet with a flat/base pay rate and numerous overtime pay rates. I would like the "Overtime" pay-rate cells to change depending on my selection from my drop-down list. My list selections are "Flat Rate" and "Compounding". 1) If I select "Flat Rate", how can I make all the overtime cell rates and totals default to zero, or calculate to zero, to show the user there are no overtime rates to apply to this job and that the one rate is to be paid? 2) And if I select "Compounding", how can the overtime rate cells be updated to add/include additional fees, to show the user that overtime rates apply to the job and that penalties have automatically been calculated on top for them? Please explain like I'm a two-year-old; I'm learning as I go. Many thanks :)
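
    A minimal formula sketch for question 1, assuming the drop-down lives in B1, the base rate in B2, and a 1.5x penalty multiplier (the cell addresses and multiplier are placeholders for whatever the worksheet actually uses):

        =IF($B$1="Flat Rate", 0, $B$2 * 1.5)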


  • Remap control key in gnome-terminal?

    - by Colin
    I just installed Ubuntu to get more familiar with it, since I'll be using it in a new job shortly. I use Macs at home and in my current job, so I'd like to make it as Mac-like as possible. I've remapped the Command and Control keys using the following .xmodmap:

        remove control = Control_L Control_R
        remove mod4 = Super_L Super_R
        add control = Super_L Super_R
        add mod4 = Control_L Control_R

    This works great for everything except the terminal, since Ctrl-C is now mapped to Cmd-C and conflicts with what I'd like to use for copy. Is there any way I can remap the Control key just for the terminal? I'm willing to consider gnome-terminal alternatives if required.


  • The simple "cron" that killed the cloud hosting option

    - by ron M.
    My SaaS application required a nightly cron job to run: analyze a database, send out e-mails, and do some database maintenance work. This job cannot be triggered by user action. Almost every "cloud" hosting solution balks at this, to the point where they tell me "we cannot do this". Is this feature so exotic that cloud hosting providers simply don't care about it? Am I using the wrong lingo here? Should I use another concept? Do I have to go with dedicated hosting, where I have "root access", as the only solution to this?


  • SQL Server 2005 transactional replication breaks before the configured number of retries

    - by ti2
    We have a SQL Server 2000 Standard database with some tables replicated (continuous transactional replication) to dozens of SQL Server 2005 Express and MSDE computers. Step 2 of the replication agent job ("Run agent") is configured by default to retry every 1 minute, up to 10 times, if a problem occurs. Because the client machines get shut down at night (they are POS machines), we changed the number of retries to 5760 (4 days), so replication would not break at night and would not need to be restarted manually. But the problem is that every other day we have at least one machine with broken replication, with this error: "The process could not connect to Subscriber 'POS986'. NOTE: The step was retried the requested number of times (5760) without succeeding. The step failed." It seems that SQL Server is not respecting the number of retries or the interval between retries as we configured them. PS: I did restart the replication job after changing the number of retries from 10 to 5760.
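
    For reference, the step's retry settings can also be set in T-SQL rather than through the job properties dialog; a sketch with an assumed job name:

        EXEC msdb.dbo.sp_update_jobstep
            @job_name = N'POS986-Distribution',
            @step_id = 2,
            @retry_attempts = 5760,
            @retry_interval = 1;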

