Search Results

Search found 4581 results on 184 pages for 'deviling master'.

Page 115/184

  • Simple Windows+Linux server provisioning? Chef/Puppet/Ansible etc

    - by Andrew
    I'm primarily a developer and part-time devops, and I manage servers here and there for my projects. I want to automate provisioning of web/app/database servers going forward. I manage a mixture of Windows and Linux servers (VPS, cloud and dedicated). I've briefly investigated Chef/Puppet/Ansible, and I want to find something that: is easy to learn and understand (I don't want to invest weeks into understanding a complicated piece of tech); ideally does not require a server ("master server") to hold the configurations; supports provisioning of Windows and Linux servers; and comes with suitable documentation to get started. Does anyone have advice on which tool is best suited? Thanks
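
    Of the three, the masterless requirement usually points at Ansible: it is agentless, pushes over SSH (or WinRM for Windows hosts), and needs no configuration server. A minimal sketch, with placeholder hostnames; the Windows connection variables may differ slightly between Ansible versions:

        # hosts.ini -- a tiny static inventory (hosts and user are examples)
        [linux]
        web1.example.com

        [windows]
        win1.example.com ansible_connection=winrm ansible_user=Administrator

        # from any workstation with Ansible installed:
        #   ansible -i hosts.ini linux   -m ping
        #   ansible -i hosts.ini windows -m win_ping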

    Read the article

  • Can't Read/Write the Hard disk used in NAS

    - by mgpyone
    I recently purchased a Synology DS212j and intended to use my two 3.5" HDs in it. One of them had been used as an external HD. When I installed these two units in the NAS, it asked me to format them in order to use them with its own filesystem (I think it's ext3?). I installed the disks but skipped the formatting option. I've just got another 3.5" hard disk and installed it in the NAS; everything's fine. However, when I take the (used) HD out of the NAS and put it back in the standalone casing, I find that it can't be read from either OS X or Windows 7. I've tried ext2sd and it only shows a 2 GB portion of the whole 1.5 TB hard disk. Here's another reference from EASEUS Partition Master
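
    Synology units typically format member disks with a Linux software-RAID plus LVM layout rather than a single plain ext3 partition, which would explain why Windows and OS X tools only see a small system partition. A diagnostic sketch from a Linux machine or live CD (device, volume group and mount point names are guesses):

        sudo fdisk -l /dev/sdb              # list the partitions the NAS created
        sudo mdadm --assemble --scan        # assemble the software-RAID member(s)
        sudo pvs && sudo vgchange -ay       # activate any LVM volumes found
        sudo mount /dev/vg1/lv /mnt         # LV name varies; check lvdisplay first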

    Read the article

  • Installing the Apple Root Certificate Authority on CentOS CLI

    - by Daniel Hollands
    I could be barking up the wrong tree here, but I'm looking for help installing Apple's root certificate (http://www.apple.com/certificateauthority/) on a CentOS server via the command line, which I need in order to send messages to their APNS system. The code I'm using for this purpose is a variation on this: https://github.com/jPaolantonio/SimplePush/blob/master/simplepush.php which works perfectly well on a Windows server, but as soon as we try to use it on a CentOS one, it falls over. We're led to believe this has something to do with not having the CA installed on our CentOS box, but all efforts to do so have failed. As the CentOS server is headless, we need to be able to do this via the command line. Can someone help?
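
    A sketch of how a root certificate is usually added to the bundle OpenSSL reads on CentOS; the exact .cer filename comes from the Apple page linked above, so treat it as an assumption:

        curl -O http://www.apple.com/appleca/AppleIncRootCertificate.cer
        # convert DER to PEM and append it to the system CA bundle
        openssl x509 -inform der -in AppleIncRootCertificate.cer -out AppleIncRootCA.pem
        cat AppleIncRootCA.pem >> /etc/pki/tls/certs/ca-bundle.crt

    Alternatively, PHP's SSL stream context can be pointed at a specific PEM file with the 'cafile' option, which avoids touching the system bundle.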

    Read the article

  • Control cell reference increment when dragging a formula in LibreOffice Calc (3.5)

    - by Chuck
    Using LibreOffice Calc (3.5) and I have a question. When copying a formula that references cells into multiple empty cells, the default is to increment each cell reference by one column or row, depending on the direction the formula is being dragged. A formula '=1 + A1' dragged horizontally changes to '=1 + B1' when pulled one cell to the right, and to '=1 + A2' when pulled one cell down. Is there a way to control or increase the increment of the referenced cell? Is it possible to have a formula '=1 + A1' that effectively changes to '=1 + A3' when dragged down one cell, '=1 + A5' when dragged down two cells, etc.? If it matters, I am trying to take a constantly updating master list of data that is organized by dates (Wednesdays and Saturdays) and create separate spreadsheets for each day of the week that can be updated by only pulling the formula down into the next cell. My attempts at using the 'lookup' function, the 'offset' function, and a sort column in LibreOffice Calc have been thwarted by my inability to figure out how to get around the single-step increment when pulling a formula down into the next cell. Thanks
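
    One sketch of the usual OFFSET trick: compute the row step from the formula's own position, so filling down advances the source reference by two rows (A1, A3, A5, ...). The sheet name 'Master' and the A1 starting cell are assumptions, and the formula is meant to be entered in row 1 of the day sheet and filled down:

        =1+OFFSET(Master.$A$1,(ROW()-1)*2,0)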

    Read the article

  • Can expire_logs_days be less than 1 day in MySQL?

    - by Scott
    So... yesterday I received an "after the fact" email about a campaign that has started for one of the services that I run. Now the DB server is getting hammered, hard, to the tune of about 300 MB/min in binary logging for the replica. As you can imagine, this is chewing up space at a fairly tremendous rate. My normal 7-day expiry of binary logs just isn't cutting it. I've resorted to purging the logs down to just the last 4 hours with (I'm verifying that replication is up to date with mk-heartbeat): PURGE MASTER LOGS BEFORE DATE_SUB( NOW(), INTERVAL 4 HOUR); I'm just running that from cron every few hours to weather the storm, but it made me question the minimum value for expire_logs_days. I haven't come across a value that is less than 1, but that doesn't mean it isn't possible. http://dev.mysql.com/doc/refman/5.0/en/server-system-variables.html#sysvar_expire_logs_days gives the type as numeric, but doesn't indicate whether it expects integers.
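
    One quick way to find out what the server will accept is simply to try a fractional value and see what sticks; a sketch (run against a non-critical instance first, with whatever credentials the monitoring user has):

        # see whether a fractional value is kept, truncated, or rejected
        mysql -e "SET GLOBAL expire_logs_days = 0.5; SHOW WARNINGS; SHOW VARIABLES LIKE 'expire_logs_days';"

    If the value gets truncated to an integer, the cron'd PURGE MASTER LOGS approach above remains the workaround.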

    Read the article

  • When to use MySQL replication or DRBD for HA on Xen VM?

    - by user62513
    I'm setting up a database which needs to provide high availability. My primary concerns are high performance and robustness (I don't want something that will fail fast and badly). The database is accessed by the application at an average of 300 qps. It will run on Xen VMs and it has some InnoDB tables as well as MyISAM tables. The VMs are connected via 100 Mbit/s Ethernet. Which of the two, MySQL replication or DRBD, would you recommend in such a situation? Or should I use DRBD to make the master database highly available and use MySQL replication on the slaves? I'm a developer, so it's not so easy for me to make a sound judgement on these things.

    Read the article

  • Tool or website or process to display previews of website templates residing in archive files?

    - by Tony_Henrich
    I have hundreds of website templates in rar or zip files. To view any of them I have to extract the archive to a temporary folder and then view the template in there. It's a time-consuming manual process to do this for each template. Is there a tool which enables me to quickly preview the templates inside the archive files? OR (if I extract each template into a separate folder off a master folder) a web app which can enable previewing of each template by automatically creating a link, or a preview image of its home page (similar to template sites)? OR any method to preview the templates in the fastest, most convenient way possible?
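
    There may not be a ready-made previewer, but the bulk extraction is easy to script; a sketch assuming unzip and unrar are installed, with ~/templates and ~/extracted as example paths:

        mkdir -p ~/extracted
        for f in ~/templates/*.zip; do
          unzip -q "$f" -d ~/extracted/"$(basename "$f" .zip)"
        done
        for f in ~/templates/*.rar; do
          mkdir -p ~/extracted/"$(basename "$f" .rar)"
          unrar x -inul "$f" ~/extracted/"$(basename "$f" .rar)"/
        done

    Once everything is in one tree, pointing any local web server at ~/extracted gives clickable previews of each template's index page.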

    Read the article

  • SSL issues with puppet agent on openSUSE

    - by Roman Grazhdan
    I have a master running on my VPS, and it has a simple helloworld manifest which works fine with any Ubuntu machine I have. It connects, exchanges keys and creates the test file all right, so I'm sure it's not a server issue. The agent, which is running on a virtual machine with openSUSE, says: err: Could not request certificate: SSL_connect returned=1 errno=0 state=SSLv3 read server certificate B: certificate verify failed. This is often because the time is out of sync on the server or client I believe it's probably a broken or missing lib, since the package is not built very carefully; it wouldn't start out of the box because of a wrong path to the lockfile, for example. So how do I figure out what exactly is wrong here? The time is all right, I've checked it. I could probably do without SSL if that's possible, since those SUSE machines are just for training, but that would be a last resort.
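
    Before suspecting the libraries, it is worth ruling out stale or mismatched certificates, which produce exactly this message. A common reset sequence (sketch; paths and the agent name are examples and vary by Puppet version):

        # on the agent: confirm the clock, then wipe its SSL state
        date
        rm -rf /var/lib/puppet/ssl
        # on the master: remove any old certificate issued to this agent
        puppet cert clean agent.example.com
        # on the agent: request and wait for a fresh certificate
        puppet agent --test --waitforcert 60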

    Read the article

  • Get percentage free space on database volumes w/ SQL Server 2005?

    - by Allen
    I am currently using SQL Server 2005 and (undocumented, I believe) master..xp_fixeddrives to get the free space on my database volumes as part of my monitoring. However, this only gives me an absolute number of MB free. What I really need is the percentage free. Is there another way in SQL Server 2005 to get this? If not, is there some other lightweight way to get it? If I can, I want to avoid installing a Java JRE, Perl, or Python on my database server. Perhaps VBScript, or a small Windows executable on the file system? Yes, I know I can Google this, and I have. It looks like there are a few ways to accomplish it, and I'm curious how my DBA brethren have handled this.
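
    One lightweight route that needs nothing installed is wmic, which reports both total size and free space so the percentage can be computed in the monitoring script; a sketch:

        :: local fixed disks only; FreeSpace/Size*100 gives the percentage
        wmic logicaldisk where "DriveType=3" get Caption,FreeSpace,Size

    The same WMI class (Win32_LogicalDisk) is reachable from VBScript if a small script on the file system is preferred.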

    Read the article

  • Setting Rails up on a Linode - Nginx Issue

    - by rctneil
    I am extremely new to this, so please don't shoot me down: I have set up a Linode running Ubuntu; it is all sort of working except Nginx. I am following this guide: http://rubysource.com/deploying-a-rails-application/ And this for nginx: http://library.linode.com/web-servers/nginx/installation/ubuntu-10.04-lucid When I go to my IP, I get a 500 Internal Server Error. I have tried starting nginx and it looks like it starts fine. I run this: ps awx | grep nginx and I get: 308 ? Ss 0:00 nginx: master process /usr/sbin/nginx 2309 ? S 0:00 nginx: worker process 2311 ? S 0:00 nginx: worker process 2312 ? S 0:00 nginx: worker process 2313 ? S 0:00 nginx: worker process 2850 pts/0 S+ 0:00 grep --color=auto nginx I really am not sure what else to do to get it running. Any help? Neil
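
    Since the workers are clearly running, the 500 is most likely coming from the site configuration or from the Rails application behind nginx, and the logs usually say which. A sketch of the first things to check (log paths are the Ubuntu defaults, the Rails path is an example):

        sudo nginx -t                                # does the config parse cleanly?
        sudo tail -n 50 /var/log/nginx/error.log     # nginx's own explanation of the 500
        tail -n 50 /path/to/app/log/production.log   # the Rails application's log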

    Read the article

  • Can GnomeKeyring store passwords unencrypted?

    - by antimeme
    I have a Fedora 15 laptop with the root and home partitions encrypted using LUKS. When it boots I have to enter a pass phrase to unlock the master key, so I have it configured to automatically log me in to my account. However, GnomeKeyring remains locked, so I have to enter another pass phrase for that. This is unpleasant and completely pointless since the entire disk is encrypted. I've not been able to find a way to configure GnomeKeyring to store its pass phrases without encryption. For example, I was not able to find an answer here: http://library.gnome.org/users/seahorse-plugins/stable/index.html.en Is there a solution? If not, is there a mailing list where it would be appropriate to plead my case?

    Read the article

  • What could be causing LVM errors on first boot after install in Debian?

    - by ianfuture
    Hi, I've installed Debian (lenny) on a machine at home. It was set up during the install to have a /boot partition; the rest was encrypted, with LVM on top of that and all the other partitions inside the LVM. After the install completed, on first boot it asked for the password to decrypt (the same password for both drives), then it showed an error which said LVM could not find a physical device with a particular UUID, or something similar. The LVM install spans two HDs. One is 120 GB and one 40 GB. The 120 GB drive is master on its IDE cable and has /boot on it. The 40 GB drive is slave on the other IDE cable. Is there anything that could be done to rescue this install, or to diagnose the problem? It took ages to get installed due to the time spent encrypting the drives and I'd rather not go through that again. :( Thanks.. Ian
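
    The symptom fits only one of the two encrypted drives being unlocked at boot, so LVM cannot see the second physical volume. A diagnostic sketch from the installer's rescue mode or a live CD (device and mapping names are examples):

        cryptsetup luksOpen /dev/hda2 crypt_a    # unlock each encrypted partition in turn
        cryptsetup luksOpen /dev/hdc1 crypt_b
        pvs                                      # does LVM now see both physical volumes?
        blkid                                    # compare UUIDs against the one in the error
        vgscan && vgchange -ay                   # try to activate the volume group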

    Read the article

  • Filter tagged threads in Thunderbird

    - by Let_Me_Be
    I have a big issue with Thunderbird: I need to process a lot of emails coming from a request-tracking system. Since only a few of those apply to me personally, I tag those threads with appropriate tags. The issue now is that I would like to filter out threads that do apply to me and those that I haven't tagged yet. I'm unable to do this, because new emails keep arriving in the already existing threads, and of course these new emails don't get the tags. Basically I would need some sort of filtering rule that applies not to a specific message but to the master message of the thread. Is there some way to create such filters, or is there some other facility that would let me do the same? Tags are great, since they are actually saved into the messages and sync correctly across multiple machines when using IMAP.

    Read the article

  • Options to efficiently synchronize 1 million files with remote servers?

    - by Zilvinas
    At the company I work for we have these things called "playlists", which are small files of ~100-300 bytes each. There are about a million of them, and about 100,000 of them get changed every hour. These playlists need to be uploaded to 10 other remote servers on different continents every hour, and it needs to happen quickly, ideally in under 2 minutes. It's very important that files that are deleted on the master are also deleted on all the replicas. We currently use Linux for our infrastructure. I was thinking about trying rsync with the -W option to copy whole files without comparing contents. I haven't tried it yet, but maybe people who have more experience with rsync could tell me if it's a viable option? What other options are worth considering?
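
    The rsync idea is viable; -W (whole-file) skips the delta algorithm, which is the right call for files this small, and --delete handles the removals. A sketch of one push, to be repeated per replica (paths and hostname are examples):

        rsync -aW --delete /srv/playlists/ replica1.example.com:/srv/playlists/

    The main cost at this scale is walking a million files on every run; if the application can emit a list of changed playlists, --files-from can cut that down, at the price of handling deletions separately.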

    Read the article

  • Duplicating keepass files instead of creating a new file

    - by BlakBat
    I'm currently using KeePass 2 and syncing the files via Dropbox. I have a few KeePass files (one for websites, one to store software licenses, etc...). Every time I need a new KeePass file, I just create a copy of the kdbx file, open it, remove all existing entries, and change the key transformation rounds to another pseudo-random value. I do not change the master password. I want to know whether this is unsafe practice or a security risk, compared to just creating a new KeePass file via the "File - New" menu. The reason I don't use the menu: I'm lazy enough to not want to reconfigure "database settings" every time.

    Read the article

  • How to let Linux Python application handle termination on user logout correctly?

    - by tuxpoldo
    I have written a Linux GUI application in Python that needs to do some cleanup tasks before being terminated when the user logs out. Unfortunately it seems that, on logout, all applications are simply killed. I tried both handling POSIX signals and DBus notifications, but nothing worked. Any idea what I could have done wrong? On application startup I register some termination handlers: # create graceful shutdown mechanisms signal.signal(signal.SIGTERM, self.on_signal_term) self.bus = dbus.SessionBus() self.bus.call_on_disconnection(self.on_session_disconnect) When the user logs out, neither self.on_signal_term nor self.on_session_disconnect is called. The problem occurs in several scenarios: Ubuntu 14.04 with Unity, Debian Wheezy with GNOME. Full code: https://github.com/tuxpoldo/btsync-deb/tree/master/btsync-gui
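
    For what it's worth, desktop sessions do not necessarily deliver SIGTERM to every process on logout; GNOME-style sessions expect applications to register with the session manager (org.gnome.SessionManager) and answer its QueryEndSession/EndSession round-trip, otherwise they may simply be killed. As a first, much simpler step, widening the signal coverage costs little; a Python sketch, where cleanup() stands in for the real work:

        import atexit
        import signal

        _cleaned_up = False

        def cleanup(*_args):
            global _cleaned_up
            if _cleaned_up:
                return
            _cleaned_up = True
            # the application's real cleanup goes here

        def on_signal(signum, frame):
            cleanup()
            raise SystemExit(0)

        atexit.register(cleanup)
        for sig in (signal.SIGTERM, signal.SIGHUP, signal.SIGINT):
            signal.signal(sig, on_signal)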

    Read the article

  • Reducing the volume of an audio device on Windows 7

    - by bdonlan
    I have a USB headset with a very loud amplifier, but low granularity in its gain control. In order to get comfortable audio, I have to reduce the individual application levels in the mixer to '1', and the master mixer to around '10'. Of course, new applications start out at '10', and immediately blast out my ears. Is there a way to add a filter to cut down the volume some so I can get better control of it? That is, reduce the volume of '100' so I can work within a reasonable range.

    Read the article

  • Can you remap "C:\Program Files" like you can with "My Documents"?

    - by Danny
    I'm not sure if this is possible, but I'm hoping you guys will know one way or the other! I'm going to be reinstalling Windows XP, and the primary master IDE drive is a smaller 10 GB drive. I'm pretty sure that if I tried to install all my programs back onto the C:\ drive, they'd not all fit. Is it possible to get my Program Files directory to point to a partition on one of my larger drives, so I don't end up with some of my programs on C:\ and others on D:\, E:\, etc.?

    Read the article

  • Postfix cleanup daemon access control

    - by Flimzy
    Is there any way to control which hosts are permitted to connect to the cleanup daemon over TCP? Our 'master.cf' contains: 2526 inet n - - - 0 cleanup This is necessary because we have a cluster of SMTP servers running custom code, and they can all inject mail to the centralized postfix server via the cleanup daemon. However, we want to allow only our authorized servers to connect to the cleanup daemon. The current configuration allows any host to connect to port 2526. Clearly we can use iptables to restrict access, but is there a way to do this within postfix itself?
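
    A sketch of the usual firewall approach, plus one Postfix-side mitigation; the subnet is an example and should be replaced with the addresses of the authorized SMTP cluster:

        # allow the cluster, drop everyone else, for the cleanup listener on 2526
        iptables -A INPUT -p tcp --dport 2526 -s 192.0.2.0/24 -j ACCEPT
        iptables -A INPUT -p tcp --dport 2526 -j DROP

    On the Postfix side, the listener can at least be bound to an internal address by prefixing it in master.cf (an entry of the form 10.0.0.1:2526 instead of plain 2526), which limits exposure but is not per-client access control.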

    Read the article

  • puppet onlyif specified nodes

    - by Valintinr
    I'm trying to write a puppet template. I have a puppet master and a few puppet agents, and they must all be handled differently. I think it's good to do this by the node's hostname. But when I tried to do this I encountered the error "puppet-agent[169037]: (/Stage[main]//Exec[adduser]) Could not evaluate: Could not find command 'ru1'"; see the code below: exec { 'adduser': command => 'sudo adduser -m -p pawSfQewWrUAA test -G wheel', path => [ '/bin','/usr/bin' ], onlyif => "$hostname == ru1" } I need this task to run only on the node with the hostname ru1. How can I do this? Thanks.
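
    The error happens because onlyif expects a command to execute, not a comparison, so after "$hostname" is interpolated Puppet tries to run 'ru1' as a program. A sketch of the usual fix: drop the onlyif and guard the resource with the hostname fact (a node block keyed on the agent's certname works too):

        # runs the exec only when the agent's hostname fact is 'ru1'
        if $::hostname == 'ru1' {
          exec { 'adduser':
            command => 'sudo adduser -m -p pawSfQewWrUAA test -G wheel',
            path    => [ '/bin', '/usr/bin' ],
          }
        }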

    Read the article

  • How to sync apps with one Mac and 3 devices?

    - by openfrog
    We have 1 Mac, with 1 iTunes account, and 3 iOS devices. Every time we sync one device, iTunes either spams it with ALL apps from ALL devices, OR it removes all of the apps from the device, including all their data. Which is very annoying. Someone told me there is a way to tell iTunes to keep track of the devices separately. How can I set up iTunes so that it will not transfer all my iPhone-only apps to my iPad every time I sync? Basically I want the devices to be the "master": they dictate which apps should be on the device, not the other way around.

    Read the article

  • Git: get back my committed data from a messed up local repo

    - by Karussell
    I am a newbie to git, so I think I did something stupid (I will move back to hg soon ;-)). Assume I was at version A and I committed a change but didn't want it. Now assume we have version B. I didn't find a good way to cleanly roll B back to A, so I went back to A via checkout and continued committing; assume I now have version C. A--->B \-->C Now the problem is that those commits were successful (I can see the SHAs and the messages in .git/logs/HEAD) but the commits do not show up in the log and I couldn't push them to GitHub. Before detecting the mess I checked out 'master' ... and git rolled everything back to B. How can I get my version C back, or are my changes lost? Is this the same problem as described here? Please close. I posted it here. On Stack Overflow there seem to be more questions related to that ...
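
    The C commits were most likely made on a detached HEAD, so they are unreferenced but not gone; anything visible in .git/logs/HEAD can still be recovered from the reflog. A sketch (the SHA is a placeholder taken from the reflog output):

        git reflog                          # locate the SHA of the lost commit "C"
        git branch recovered-c <sha-of-C>   # give it a branch name so it can't be GC'd
        git checkout master
        git merge recovered-c               # or: git cherry-pick <sha-of-C>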

    Read the article

  • Data transfer is extremely slow after partitioning an external USB drive

    - by user125912
    I bought an external USB 3.0 drive with 500 GB capacity. The OS is Windows 7. I use it in a USB 2.0 slot, no problem. Initially I used it without making several partitions and it was fast as hell. Then I had the great idea to make partitions: one for programs, one for data and one for backup. I chose the free EASEUS Partition Master 9.1.1 and ended up with these partitions: F: Apps, primary, NTFS, 100 GB; H: Data, logical, NTFS, 250 GB; B: Backup, logical, NTFS, 150 GB. THE PROBLEM: When I copy files from C: to F: I get a transfer rate of about 100 KB/s! When I copy files from C: to H: I get a transfer rate of about 4 MB/s! That's all much too slow, slower than before. What can I do to speed things up? Thanks in advance!
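
    One thing worth checking before anything else: third-party partitioning tools have been known to create partitions that are not aligned to 4 KiB boundaries, which can cut write speed drastically on modern drives. A quick check (sketch; run from an elevated command prompt):

        :: StartingOffset should be evenly divisible by 4096 for each partition
        wmic partition get Name,StartingOffset,Size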

    Read the article

  • Converting massive images to PDF, without crashing applications

    - by BloodyIron
    I'm trying to work with a large-format scanner, and we are scanning very long documents. For example, one of our documents was cut into two pieces, and one of those pieces is 3633x82486 pixels. My application, Scanning Master 21+, which comes with the device (Graphtec CSX300-09), can output PDF; however, when I try to save to PDF it complains about the file being too large. I can successfully output to BMP, however. GIMP can even open this BMP, after taking a while to load it. The resulting files range from 200 MB to 1.2 GB in size. Acrobat refuses to open the BMP format, saying it isn't supported or is damaged (which I know is not true). As I mentioned, the PDF plugin for GIMP crashes when I try to export to PDF. I'm really not sure what is the best tool for this job. So what is the best tool to produce PDF documents from very large images?
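
    If the goal is just BMP-to-PDF conversion of very large scans, ImageMagick from the command line may manage where GUI tools give up; a sketch (filenames are examples, and the limits are there to keep memory use bounded):

        convert -limit memory 1GiB -limit map 2GiB scan.bmp -compress jpeg scan.pdf

    On Windows it is safer to call ImageMagick's convert.exe by its full install path, so it is not confused with the built-in convert command.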

    Read the article

  • SQL database dumps failing every night

    - by chaseman36
    Hey guys, I have sql05, and my maintenance plan, which backs up a database to an external storage SAN, has been failing every night. Here is my error: Executing the query "BACKUP DATABASE [master] TO DISK = N'\\192.168.x.x\vmbackup\server\dbbackup\master_backup_201004222300.bak' WITH NOFORMAT, NOINIT, NAME = N'master_backup_20100422230002', SKIP, REWIND, NOUNLOAD, STATS = 10 " failed with the following error: "Cannot open backup device '\\192.168.x.x\vmbackup\server\dbbackup\master_backup_201004222300.bak'. Operating system error 5(Access is denied.). BACKUP DATABASE is terminating abnormally.". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly. I googled this error and tried adding permissions on the backup device for NETWORK SERVICE, as recommended at Experts Exchange, but no dice. Does anyone have any ideas?
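
    Operating system error 5 on a UNC backup path almost always means the account running the SQL Server service (not the account running the job, and not necessarily NETWORK SERVICE) lacks rights on the share. A sketch of how to confirm which account that is (the service name shown is the default instance's):

        :: shows SERVICE_START_NAME, the account the database engine runs as
        sc qc MSSQLSERVER
        :: that account needs both share and NTFS permissions on \\192.168.x.x\vmbackup;
        :: if the service runs as NETWORK SERVICE or LocalSystem, the grant goes to the
        :: computer account (DOMAIN\MACHINENAME$) instead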

    Read the article
