Search Results

Search found 4705 results on 189 pages for 'export to csv'.

  • Rendering scaled-down card images

    - by user1065145
    I have high-quality SVG card images, but they drastically lose their quality when I downsize them. I have tried two ways of rendering the cards (using Inkscape and ImageMagick):
    1) Render the SVG to a high-resolution PNG and resize that afterwards:
        inkscape -D --export-png=QS1024.png --export-width=1024 QS.svg
        convert QS1024.png -filter Lanczos -sampling-factor 1x1 -resize 71x QS71.png
    2) Render the SVG to an image of the proper size in one step:
        inkscape -D --export-png=QS71.png --export-width=71 QS.svg
    Both approaches generate blurry card images, which look even worse than the old Windows cards. What is the best way to generate smaller card images from the SVG sources without losing much of their quality? UPDATE: I am using Inkscape to render SVG to PNG and then ImageMagick to downsize the PNG. I've tried convert -resize with a couple of filters (Lanczos/Mitchell/etc.), but the result was pretty much the same.
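    A minimal sketch of the oversample-then-downscale route, assuming Inkscape and ImageMagick 6 are on the PATH and QS.svg is the source card; the 8x oversampling factor and the -unsharp values are illustrative choices to tune, not taken from the post:

        #!/bin/sh
        # Render the SVG well above the target size, then shrink with a
        # high-quality filter and a light unsharp mask to recover edge contrast.
        TARGET=71
        BIG=$((TARGET * 8))

        inkscape -D --export-png=card_big.png --export-width="$BIG" QS.svg

        # Resizing in linear RGB (then converting back to sRGB) avoids the
        # dark halos that gamma-space downscales tend to produce.
        convert card_big.png -colorspace RGB -filter Lanczos \
                -resize "${TARGET}x" -colorspace sRGB \
                -unsharp 0x0.75+0.75+0.008 "QS${TARGET}.png"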

  • Can I "export" an alias to the SHELL that invoked a script?

    - by RonK
    I'm trying to write a utility script that defines certain aliases. My SHELL is tcsh (can't change that). I tried the following:
        #!/bin/tcsh
        alias log 'less ~/logs/log.date '+%Y%m%d'''
    Then I run it like this: ./myscript log
    The output I get is: log: Command not found.
    Naturally, if I run it like this: source myscript log
    everything is fine. Is there any way to do it without specifying source ...?
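    For context, ./myscript starts a child tcsh, and nothing a child process does (aliases, environment, working directory) can flow back into the shell that invoked it; only source, which runs the script in the current shell, can. A sketch of the usual arrangement, with the file name ~/.aliases.csh invented for illustration:

        # ~/.aliases.csh -- alias definitions only
        alias ll   'ls -l'
        alias logs 'cd ~/logs'

        # ~/.cshrc -- every new interactive tcsh reads this, so the aliases
        # land in the invoking shell itself rather than in a child process
        if ( -r ~/.aliases.csh ) then
            source ~/.aliases.csh
        endif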

  • Building a Solaris 11 repository without network connection

    - by user12611852
    Solaris 11 has been released and is a fantastic new iteration of Oracle's rock-solid, enterprise operating system. One of the great new features is the repository-based Image Packaging System. IPS not only introduces new cloud-based package installation services, it is also integrated with our zones, boot environments and ZFS file systems to provide a safe, easy and fast way to perform system updates. My customers typically don't have network access and, in fact, can't connect to any network until they have "Authority to connect." It's useful, however, to build up a Solaris 11 system with additional software using the new Image Packaging System and a locally stored repository. The Solaris 11 documentation describes how to create a locally stored repository, with full explanations of what the commands do; I'm simply providing the quick and dirty steps. The easiest way is to download the ISO image, burn it to a DVD and insert it into your DVD drive. Then, as root:
        pkg set-publisher -G '*' -g file:///cdrom/sol11repo_full/repo solaris
    Now you can install software using the GUI package manager or the pkg commands. If you would like something more permanent (or don't have a DVD drive), however, it takes a little more work. After installing Solaris 11, download (on another system, perhaps) the two files that make up the Solaris 11 repository from our download site, sneaker-net the files to your Solaris 11 system, then unzip and cat the two files together to create one large ISO image (the file is about 6.9 GB in size). Then:
        zfs create rpool/export/repoSolaris11
        zfs set atime=off rpool/export/repoSolaris11
        zfs set compression=on rpool/export/repoSolaris11    (save some space)
        lofiadm -a sol-11-1111-repo-full.iso /dev/lofi/1
        mount -F hsfs /dev/lofi/1 /mnt
    You could stop here and set the publisher to point to the /mnt/repo location; however, this mount will not be persistent across reboots. Copy the repository from the mounted ISO image to a permanent, on-disk location:
        rsync -aP /mnt/repo /export/repoSolaris11
        pkgrepo -s /export/repoSolaris11 refresh
        pkg set-publisher -G '*' -g /export/repoSolaris11/repo solaris
    You now have a locally installed repository for adding additional software packages to Solaris 11. The documentation also takes you through publishing your repository on the network so that others can access it.

  • Moving Away from Exchange public folders - Export to file system folder?

    - by Mr. Monkey
    I have a public folder that was used for the wrong reason. Due to some regulations we had to store lots of photos, we're talking at least 7000 photos, organized by store location. So, for example, each store would send in an email with at least 2 photos of their location; that email would contain their location name or number and those photos, so there was some sort of organization to it. I would love to move the contents of that public folder to a normal Windows folder we could share on a server. Is anything like that possible? Does anybody have other ideas?

  • How to take a CSS animation from a browser, and export a GIF of it?

    - by Truth
    I have the following CSS3 animation going on: http://dabblet.com/gist/2884702. It's basically a simulation of a mirror rotating on its x-axis. Now, I wish to present that in a PowerPoint presentation. Since PowerPoint doesn't have the WebKit engine, I want to extract an animated GIF of that animation and embed it into my presentation. The problem? No matter what I've tried, I couldn't make a reasonably smooth animated GIF. I've Googled and found many free programs that claim to do the job, tried several, and none worked as expected. I've tried IrfanView, same issue (also their site makes me want to vomit). So, is there a solution? Or am I doomed to not be able to display it?
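    One route that tends to work is to screen-record the animation in the browser (QuickTime, OBS, whatever is handy) and convert the recording with ffmpeg; the file names, frame rate and width below are placeholders, and the two-pass palette step is there to keep the mirror's gradients from banding:

        # Pass 1: build an optimised 256-colour palette from the recording
        ffmpeg -i mirror.mov -vf "fps=20,scale=480:-1:flags=lanczos,palettegen" palette.png

        # Pass 2: encode the GIF using that palette
        ffmpeg -i mirror.mov -i palette.png \
               -filter_complex "fps=20,scale=480:-1:flags=lanczos[x];[x][1:v]paletteuse" mirror.gif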

  • deb package building

    - by newcode
    While trying to build a package, I gave the following commands in a terminal:
        cd Downloads/src/
        cd unity-5.10.0/
        dpkg-buildpackage -rfakeroot -uc -b
    Then it gives the output:
        dpkg-buildpackage: export CFLAGS from dpkg-buildflags (origin: vendor): -g -O2 -fstack-protector --param=ssp-buffer-size=4 -Wformat -Wformat-security
        dpkg-buildpackage: export CPPFLAGS from dpkg-buildflags (origin: vendor): -D_FORTIFY_SOURCE=2
        dpkg-buildpackage: export CXXFLAGS from dpkg-buildflags (origin: vendor): -g -O2 -fstack-protector --param=ssp-buffer-size=4 -Wformat -Wformat-security
        dpkg-buildpackage: export FFLAGS from dpkg-buildflags (origin: vendor): -g -O2
        dpkg-buildpackage: export LDFLAGS from dpkg-buildflags (origin: vendor): -Wl,-Bsymbolic-functions -Wl,-z,relro
        dpkg-buildpackage: source package unity
        dpkg-buildpackage: source version 5.10.0-0ubuntu6
        dpkg-buildpackage: source changed by Didier Roche <[email protected]>
        dpkg-buildpackage: host architecture i386
        dpkg-source --before-build unity-5.10.0
        dpkg-checkbuilddeps: Unmet build dependencies: libutouch-grail-dev (>= 1.0.20) libutouch-geis-dev (>= 2.0.10)
        dpkg-buildpackage: warning: Build dependencies/conflicts unsatisfied; aborting.
        dpkg-buildpackage: warning: (Use -d flag to override.)
    Then I tried to install the package using:
        cd..
        sudo dpkg -i *deb
    And it gives:
        [sudo] password for harshnarang8:
        dpkg: error processing *deb (--install):
        cannot access archive: No such file or directory
        Errors were encountered while processing:
        *deb
    What exactly is causing the problem and how do I get around it?
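    The two failures are separate: the build aborted because the libutouch build dependencies are missing, and dpkg -i *deb found nothing because the built .deb files land in the parent directory and the glob needs a dot. A sketch of the usual sequence, assuming the Ubuntu unity source package and deb-src lines enabled in sources.list:

        # Pull in everything the unity source package declares as a build dependency
        sudo apt-get build-dep unity

        # Rebuild; the resulting .deb files are written one directory up
        cd Downloads/src/unity-5.10.0
        dpkg-buildpackage -rfakeroot -uc -b

        # Install the freshly built packages from the parent directory
        cd ..
        sudo dpkg -i ./*.deb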

  • how to know which display number to export in the DISPLAY variable when sshing to a server

    - by insidepower
    When I ssh to a server using -X, I am always confused about which display number I should export. It seems to me that sometimes the display number is already taken by something, so all I can do is:
        export DISPLAY=localhost:0 && xclock
        export DISPLAY=localhost:1 && xclock
        export DISPLAY=localhost:2 && xclock
        export DISPLAY=localhost:...
    until the clock appears, and then I use that display number. Each time I log in to the server, the display number that actually tunnels the GUI data is different. I know many similar questions have been asked and answered, but I couldn't find an answer to this one. Does anyone know about it? Thanks!
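    For what it's worth, ssh -X sets DISPLAY for the session on its own (typically localhost:10.0, counting upward as more forwarded sessions are open), so the usual fix is simply not to override it. A quick check, with the host name as a placeholder:

        ssh -X user@server
        # Inside the session, sshd has already assigned a display:
        echo $DISPLAY          # e.g. localhost:10.0, localhost:11.0, ...
        xclock                 # works without exporting anything by hand

        # If $DISPLAY comes back empty, forwarding was refused; reconnect with
        # ssh -v -X and read the X11 forwarding messages to see why.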

  • [How-to] Make a working XML file in PHP (encoding properly) to export data from an online database to iPhone

    - by gotye
    Hey guys, I am trying to make an XML file to export my data from my database to my iPhone. Each time I create a new post, I need to execute a PHP file to create an XML file containing the latest posts ;) Okay so far? :D Here is the current PHP code ... but my NSXMLParser gives me an error code (33 - String is not started). I have no idea what kind of PHP functions I have to use ...
        <?php
        // beginning of the XML file
        $xml .= '<?xml version=\"1.0\" encoding=\"UTF-8\"?>';
        $sml .= '<channel>';
        $xml .= '<title>Infonul</title>';
        $xml .= '<link>aaa</link>';
        $xml .= '<description>aaa</description>';
        // connect to the database (update the password)
        $connect = mysql_connect('...-12','...','...');
        /* select the mysql database */
        mysql_select_db('...');
        // select the 20 latest news posts
        $res=mysql_query("SELECT u.display_name as author,p.post_date as date,p.comment_count as commentCount, p.post_content as content,p.post_title as title FROM wp_posts p, wp_users u WHERE p.post_status = 'publish' and p.post_type = 'post' and p.post_author = u.id ORDER BY p.id DESC LIMIT 0,20");
        // extract the information and append it to the content
        while($tab=mysql_fetch_array($res)){
            $title=$tab[title];
            $author=$tab[author];
            $content=$tab[content]; //html stuff
            $commentCount=$tab[commentCount];
            $date=$tab[date];
            $xml .= '<item>';
            $xml .= '<title>'.$title.'</title>';
            $xml .= '<content><![CDATA['.$content.']]></content>';
            $xml .= '<date>'.$date.'</date>';
            $xml .= '<author>'.$author.'</author>';
            $xml .= '<commentCount>'.$commentCount.'</commentCount>';
            $xml .= '</item>';
        }
        // end of the XML file
        $xml .= '</channel>';
        $xml = utf8_encode($xml);
        echo $xml;
        // write to the file
        if ($fp = fopen("20news.xml",'w')) {
            fputs($fp,$xml);
            fclose($fp);
        }
        //mysql_close();
        ?>
    I noticed a few things when I open 20news.xml in my browser: I get squares instead of single quotes ... I can't see <![CDATA[ but ]] is clearly visible ... why?!? Thanks for any input ;) Gotye.
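    A couple of hedged command-line checks that usually narrow this kind of problem down (assuming the file the script writes is 20news.xml and that file, xmllint and iconv are available):

        file 20news.xml              # reports the actual encoding of the output
        xmllint --noout 20news.xml   # prints the first structural or encoding error, if any

        # If the rows come out of MySQL as latin1, re-encoding the finished file
        # quickly confirms whether the declared UTF-8 and the real bytes disagree:
        iconv -f ISO-8859-1 -t UTF-8 20news.xml > 20news.utf8.xml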

  • A new MEF error I've not seen before -- "The export is not assignable to type..."

    - by Dave
    I was very surprised to get this error today, as it's one that I've never encountered before. Everything in the code looked okay, so I did some searches. The previous questions and their respective answers didn't help:
    - This one was solved when the poster made sure his assembly references were consistent. I don't have this issue right now because I'm currently referencing another project in my solution.
    - This one was solved when the poster was instructed to use ImportMany, but I am already using it (I think properly, too) to try to load multiple plugins.
    - This one was solved when the poster realized that there was a platform target mismatch. I've already gone through my projects to ensure that everything targets x86.
    So here's what I am trying to do. I have a plugin that owns a connection to a device. I might also need to be able to share that connection with another plugin. I decided that the cleanest way to do this was to create an interface that would allow the slave plugin to request its own connection to the device. Let's just call it IConnectionSharer. If the slave plugin does not need to borrow this connection and has its own, then it should use its own implementation of IConnectionSharer to connect to the device. My "master" plugin (the one that owns the connection to the device) implements IConnectionSharer. It also exports this via ExportAttribute. My "slave" plugin assembly defines a class that also implements and exports IConnectionSharer. When the application loads, the intent is for my slave plugin, via MEF, to enumerate all IConnectionSharers and store them in an IEnumerable<IConnectionSharer>. It does so like this:
        [ImportMany]
        public IEnumerable<IConnectionSharer> AllSharedConnections { get; set; }
    But during part composition, I get the error: the export 'Company.MasterPlugin (ContractName="IConnectionSharer")' is not assignable to type 'IConnectionSharer'. The error message itself seems clear enough -- it's as if MEF thinks my master plugin doesn't inherit from IConnectionSharer... but it does! Can anyone suggest further debugging strategies? I'm going to start the painful process of single-stepping through the MEF source.

  • How to make this .htaccess rule case insensitive?

    - by alex
    This is a rule in my .htaccess:
        # those CSV files are under the DOCROOT ... so let's hide 'em
        <FilesMatch "\.CSV$">
        Order Allow,Deny
        Deny from all
        </FilesMatch>
    I've noticed, however, that if there is a file with a lowercase or mixed-case CSV extension, it will be ignored by the rule and displayed. How do I make this case insensitive? I hope it doesn't come down to "\.(?:CSV|csv)$" (which I'm not sure would even work, and doesn't cover all bases). Note: the files are under the docroot and are uploaded there automatically by a 3rd-party service, so I'd prefer to implement a rule at my end instead of bothering them. Had I set this site up, though, I'd have gone for above the docroot. Thanks
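    FilesMatch takes a PCRE pattern in Apache 2.x, so an inline case-insensitivity flag should cover every mix of upper and lower case; a sketch of the same rule with that one change, assuming the 2.2-style access directives used above:

        # Case-insensitive match on the .csv extension
        <FilesMatch "(?i)\.csv$">
            Order Allow,Deny
            Deny from all
        </FilesMatch>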

  • Java applet - access denied to file on same web server

    - by me_here
    I've written a simple Java applet to generate a technical image based upon some data in a CSV file. I'm passing in the CSV file as a parameter to the applet:
        <applet code = "assaymap.AssayMapApplet"
                archive = "http://localhost/applet_test/AssayMap.jar"
                height="600px" width="800px">
            <param name="csvFile" value="http://localhost/applet_test/test.csv">
        </applet>
    As far as I understood applet security restrictions, an applet should be able to read data from the host it is served from. The applets here http://www.jalview.org/examples/applets.html are using the same approach of passing in a text data file as a parameter, so I'm not sure why my own applet isn't working. The error message that gets thrown is:
        Error with processing the CSV data.
        java.security.AccessControlException: access denied (java.io.FilePermission http:\localhost\applet_test\test.csv read)

  • MySQL INTO OUTFILE override existing file?

    - by Derek Organ
    I've written a big SQL script that creates a CSV file. I want to call a cron job every night to create a fresh CSV file and have it available on the website. Say, for example, I store my file in '/home/sites/example.com/www/files/backup.csv' and my SQL is:
        SELECT *
        INTO OUTFILE '/home/sites/example.com/www/files/backup.csv'
        FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
        LINES TERMINATED BY '\n'
        FROM ( ....
    MySQL gives me an error when the file already exists:
        File '/home/sites/example.com/www/files/backup.csv' already exists
    Is there a way to make MySQL overwrite the file? I could have PHP detect if the file exists and delete it before creating it again, but it would be more succinct if I could do it directly in MySQL.
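    MySQL itself refuses to overwrite an existing OUTFILE target, so the common pattern is to have the cron wrapper export to a temporary name and move it into place (or delete the old file first); a sketch, with the path, credentials and inner query as placeholders:

        #!/bin/sh
        # nightly-export.sh -- regenerate the CSV for the website (names are illustrative)
        OUT=/home/sites/example.com/www/files/backup.csv
        TMP=/home/sites/example.com/www/files/backup.$$.csv

        mysql -u exportuser -p'secret' mydb -e \
          "SELECT * INTO OUTFILE '$TMP'
             FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
             LINES TERMINATED BY '\n'
           FROM mytable"

        # Replace last night's file only if the export actually produced data
        [ -s "$TMP" ] && mv -f "$TMP" "$OUT"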

  • How can I get around MySQL Errcode 13 with SELECT INTO OUTFILE?

    - by Ryan Olson
    I am trying to dump the contents of a table to a CSV file using a MySQL SELECT INTO OUTFILE statement. If I do:
        SELECT column1, column2
        INTO OUTFILE 'outfile.csv'
        FIELDS TERMINATED BY ','
        FROM table_name;
    outfile.csv will be created on the server in the same directory this database's files are stored in. However, when I change my query to:
        SELECT column1, column2
        INTO OUTFILE '/data/outfile.csv'
        FIELDS TERMINATED BY ','
        FROM table_name;
    I get:
        ERROR 1 (HY000): Can't create/write to file '/data/outfile.csv' (Errcode: 13)
    Errcode 13 is a permissions error, but it occurs even if I change ownership of /data to mysql:mysql and give it 777 permissions. MySQL is running as user "mysql". Strangely, I can create the file in /tmp, just not in any other directory I've tried, even with permissions set such that user mysql should be able to write to the directory. This is MySQL 5.0.75 running on Ubuntu.
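    On Ubuntu, the usual culprit when the filesystem permissions look right is AppArmor confining mysqld to a whitelist of paths (which is also why /tmp keeps working); a hedged sketch of checking and extending the profile, noting that the profile path and reload command can vary by release:

        # Is mysqld running under an AppArmor profile?
        sudo aa-status | grep mysqld

        # If so, allow writes under /data by adding these lines inside the
        # profile block in /etc/apparmor.d/usr.sbin.mysqld:
        #     /data/ r,
        #     /data/** rw,

        # Reload the profile so the change takes effect
        sudo apparmor_parser -r /etc/apparmor.d/usr.sbin.mysqld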

  • php/dos : How do you parse a regedit export file?

    - by phill
    My objective is to look for the Company key-value in the registry hive and then pull the corresponding GUID and the other keys and values following it. So I figured I would run the regedit export command and then parse the file with PHP for the keys I need. After running the DOS batch command:
        regedit /E "output.txt" "HKLM\System....\Company1"
    the output text file seems to be in some kind of Unicode format which isn't regex friendly. I'm using PHP to parse the file and pull the keys. Here is the PHP code I'm using to parse the file:
        <?php
        $regfile = "output.txt";
        $handle = fopen ("c:\\\\" . $regfile,"r");
        //echo "handle: " . $file . "<br>";
        $row = 1;
        while ((($data = fgets($handle, 1024)) !== FALSE) ) {
            $num = count($data);
            echo "$num fields in line $row: \n";
            $reg_section = $data;
            //$reg_section = "[HKEY_LOCAL_MACHINE\SOFTWARE\TECHNOLOGIES\MEDIUS\CONFIG MANAGER\SYSTEM\COMPANIES\RECORD11]";
            $pattern = "/^(\[HKEY_LOCAL_MACHINE\\\SOFTWARE\\\TECHNOLOGIES\\\MEDIUS\\\CONFIG MANAGER\\\SYSTEM\\\COMPANIES\\\RECORD(\d+)\])$/";
            if ( preg_match($pattern, $reg_section )) {
                echo "<font color=red>Found</font><br>";
            } else {
                echo "not found<br>";
                echo $data . "<br>";
            }
            $row++;
        } //end while
        fclose($handle);
        ?>
    and the output looks like this:
        1 fields in line 1: not found
        ÿþW?i?n?d?o?w?s? ?R?e?g?i?s?t?r?y? ?E?d?i?t?o?r? ?V?e?r?s?i?o?n? ?5?.?0?0? ?
        1 fields in line 2: not found
        1 fields in line 3: not found
        [?H?K?E?Y??L?O?C?A?L??M?A?C?H?I?N?E?\?S?O?F?T?W?A?R?E?\?I?N?T?E?R?S?T?A?R? ?T?E?C?H?N?O?L?O?G?I?E?S?\?X?M?E?D?I?U?S?\?C?O?N?F?I?G? ?M?A?N?A?G?E?R?\?S?Y?S?T?E?M?\?C?O?M?P?A?N?I?E?S?]? ?
        1 fields in line 4: not found
        "?N?e?x?t? ?R?e?c?o?r?d? ?I?D?"?=?"?4?1?"? ?
        1 fields in line 5: not found
    Any ideas on how to approach this? Thanks in advance.
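    Since regedit /E writes UTF-16LE with a byte-order mark (the ÿþ prefix and the ?-separated characters above are exactly that), the simplest approach is to convert the export to UTF-8 before parsing it; a sketch using command-line tools, with file names as placeholders:

        # Convert the UTF-16 export (BOM included) to plain UTF-8
        iconv -f UTF-16 -t UTF-8 output.txt > output.utf8.txt

        # Ordinary regex tools then work on the converted file
        grep -E '\[HKEY_LOCAL_MACHINE\\.*\\COMPANIES\\RECORD[0-9]+\]' output.utf8.txt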

  • error executing aapt, all of a sudden

    - by pjv
    I know there are a lot of these topics around, but none seem to help in my case, nor describe it exactly. The best similar one is "aapt not found under the right path". My problem is that I can be using Eclipse for a whole evening, programming, compiling and using my device, and then suddenly I get "error executing aapt" for my current project and of course R.java isn't (properly) generated anymore. I then restart Eclipse and everything goes away. I'm seeing this once a day on average, however. I've recently switched to amd64 and installed the latest Android 2.3 SDK and matching tools. I know that there is now a platform-tools folder that has an aapt version that should work independently of the SDK version. At first I had added this directory to my PATH, as instructed on the SDK website. I've also tried not adding it to my path and making a link platforms/android-9/tools so that every SDK version might use its own old copy. Needless to say, platform-tools/aapt is there and has the right permissions, and I have been able to execute it on the command line at any time. When I do write a faulty xml file of sorts, and appropriately get an error, I see an extra line that says "aapt: /lib32/libz.so.1: no version information available". I'm running a recent Gentoo Linux system. I have everything installed to support x86 on amd64, but have re-emerged emul-linux-x86-baselibs and zlib just to be sure. The problem persists. I do see some pages that spell horror over some zlib bugs, but I'm not sure if that's related. I realize I'm not on the reference Ubuntu platform, but surely the difference cannot be that great? It might very well be a bug in aapt or the tools themselves. Why would it suddenly stop working? I also experience that the IDs in R.java were incorrect, namely that simple findViewById() code would give ClassCastExceptions because of mixed IDs one time, and then work perfectly with no changes other than a "clean project", in the aftermath of a failing aapt. Finally, I've run a few commands on aapt that don't seem to add any extra information:
        #ldd aapt
        ./aapt: /lib32/libz.so.1: no version information available (required by ./aapt)
        linux-gate.so.1 => (0xffffe000)
        librt.so.1 => /lib32/librt.so.1 (0x4f864000)
        libpthread.so.0 => /lib32/libpthread.so.0 (0x4f849000)
        libz.so.1 => /lib32/libz.so.1 (0xf7707000)
        libstdc++.so.6 => /usr/lib/gcc/x86_64-pc-linux-gnu/4.4.4/32/libstdc++.so.6 (0x415e9000)
        libm.so.6 => /lib32/libm.so.6 (0x4f876000)
        libgcc_s.so.1 => /lib32/libgcc_s.so.1 (0x4fac6000)
        libc.so.6 => /lib32/libc.so.6 (0x4f5ed000)
        /lib/ld-linux.so.2 (0x4f5ca000)
        #file aapt
        aapt: ELF 32-bit LSB executable, Intel 80386, version 1 (SYSV), dynamically linked (uses shared libs), for GNU/Linux 2.6.15, not stripped
    Can anybody tell anything wrong with my configuration? Does it smell like a bug perhaps (otherwise let's report it (again))? Update 2010-01-06: I've gained some more knowledge. When I recently tried to export a signed apk, I ran into another error message (full details from the Eclipse error view) regarding aapt that I hadn't seen before. Note here too that I can just restart Eclipse and export apks again without problems, at least for a little while. I'm starting to think it is related to a lack of memory on my system. The message "onvoldoende geheugen beschikbaar" means "insufficient memory available". I have also been seeing insufficient-memory errors in DDMS when I'm dumping HPROF files.
    Here is the error log (shortened):
        !ENTRY com.android.ide.eclipse.adt 4 0 2011-01-05 23:11:16.097
        !MESSAGE Export Wizard Error
        !STACK 1
        org.eclipse.core.runtime.CoreException: Failed to export application
            at com.android.ide.eclipse.adt.internal.project.ExportHelper.exportReleaseApk(Unknown Source)
            at com.android.ide.eclipse.adt.internal.wizards.export.ExportWizard.doExport(Unknown Source)
            at com.android.ide.eclipse.adt.internal.wizards.export.ExportWizard.access$0(Unknown Source)
            at com.android.ide.eclipse.adt.internal.wizards.export.ExportWizard$1.run(Unknown Source)
            at org.eclipse.jface.operation.ModalContext$ModalContextThread.run(ModalContext.java:121)
        Caused by: com.android.ide.eclipse.adt.internal.build.AaptExecException: Error executing aapt. Please check aapt is present at /opt/android-sdk/android-sdk-linux_x86-1.6_r1/platform-tools/aapt
            at com.android.ide.eclipse.adt.internal.build.BuildHelper.executeAapt(Unknown Source)
            at com.android.ide.eclipse.adt.internal.build.BuildHelper.packageResources(Unknown Source)
            ... 5 more
        Caused by: java.io.IOException: Cannot run program "/opt/android-sdk/android-sdk-linux_x86-1.6_r1/platform-tools/aapt": java.io.IOException: error=12, Onvoldoende geheugen beschikbaar
            ...
        Caused by: java.io.IOException: java.io.IOException: error=12, Onvoldoende geheugen beschikbaar
            ...
        !SUBENTRY 1 com.android.ide.eclipse.adt 4 0 2011-01-05 23:11:16.098
        !MESSAGE Failed to export application
        !STACK 0
        com.android.ide.eclipse.adt.internal.build.AaptExecException: Error executing aapt. Please check aapt is present at /opt/android-sdk/android-sdk-linux_x86-1.6_r1/platform-tools/aapt
            at com.android.ide.eclipse.adt.internal.build.BuildHelper.executeAapt(Unknown Source)
            at com.android.ide.eclipse.adt.internal.build.BuildHelper.packageResources(Unknown Source)
            at com.android.ide.eclipse.adt.internal.project.ExportHelper.exportReleaseApk(Unknown Source)
            at com.android.ide.eclipse.adt.internal.wizards.export.ExportWizard.doExport(Unknown Source)
            at com.android.ide.eclipse.adt.internal.wizards.export.ExportWizard.access$0(Unknown Source)
            at com.android.ide.eclipse.adt.internal.wizards.export.ExportWizard$1.run(Unknown Source)
            at org.eclipse.jface.operation.ModalContext$ModalContextThread.run(ModalContext.java:121)
        Caused by: java.io.IOException: Cannot run program "/opt/android-sdk/android-sdk-linux_x86-1.6_r1/platform-tools/aapt": java.io.IOException: error=12, Onvoldoende geheugen beschikbaar
            ...
        Caused by: java.io.IOException: java.io.IOException: error=12, Onvoldoende geheugen beschikbaar
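    Given the error=12 (ENOMEM) raised when Eclipse tries to fork aapt, one hedged thing to check is plain memory headroom at the moment of failure; the swap-file size and path below are arbitrary illustrations, not a confirmed fix:

        # How much free memory and swap are left when the export fails?
        free -m

        # A temporary swap file often lets the fork/exec of external tools succeed
        sudo dd if=/dev/zero of=/swapfile bs=1M count=2048
        sudo chmod 600 /swapfile
        sudo mkswap /swapfile
        sudo swapon /swapfile

        # Trimming -Xmx in eclipse.ini also leaves more room for child processes.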

  • MaxStartups and MaxSessions configurations parameter for ssh connections?

    - by Webby
    I am copying files from machineB and machineC onto machineA, where the shell script below runs. If a file is not on machineB then it will be on machineC for sure, so I try copying it from machineB first and, if that fails, from machineC. I am copying the files in parallel using the GNU Parallel library and it is working fine. Currently I am copying 10 files in parallel. Below is my shell script -
        #!/bin/bash
        export PRIMARY=/test01/primary
        export SECONDARY=/test02/secondary
        readonly FILERS_LOCATION=(machineB machineC)
        export FILERS_LOCATION_1=${FILERS_LOCATION[0]}
        export FILERS_LOCATION_2=${FILERS_LOCATION[1]}
        PRIMARY_PARTITION=(550 274 2 546 278) # this will have more file numbers
        SECONDARY_PARTITION=(1643 1103 1372 1096 1369 1568) # this will have more file numbers
        export dir3=/testing/snapshot/20140103
        find "$PRIMARY" -mindepth 1 -delete
        find "$SECONDARY" -mindepth 1 -delete
        do_Copy() {
            el=$1
            PRIMSEC=$2
            scp david@$FILERS_LOCATION_1:$dir3/new_weekly_2014_"$el"_200003_5.data $PRIMSEC/. || scp david@$FILERS_LOCATION_2:$dir3/new_weekly_2014_"$el"_200003_5.data $PRIMSEC/.
        }
        export -f do_Copy
        parallel --retries 10 -j 10 do_Copy {} $PRIMARY ::: "${PRIMARY_PARTITION[@]}" &
        parallel --retries 10 -j 10 do_Copy {} $SECONDARY ::: "${SECONDARY_PARTITION[@]}" &
        wait
        echo "All files copied."
    Problem statement: at some point the above script starts getting this exception -
        ssh_exchange_identification: Connection closed by remote host
        ssh_exchange_identification: Connection closed by remote host
        ssh_exchange_identification: Connection closed by remote host
    I guess the error is typically caused by too many ssh/scp connections starting at the same time, which leads me to believe /etc/ssh/sshd_config:MaxStartups and MaxSessions are set too low. But my question is: on which server are they too low, machineB and machineC or machineA? And on which machines do I need to increase the numbers? On machineA this is what I can find -
        root@machineA:/home/david# grep MaxStartups /etc/ssh/sshd_config
        #MaxStartups 10:30:60
        root@machineA:/home/david# grep MaxSessions /etc/ssh/sshd_config
    And on machineB and machineC this is what I can find -
        [root@machineB ~]$ grep MaxStartups /etc/ssh/sshd_config
        #MaxStartups 10
        [root@machineB ~]$ grep MaxSessions /etc/ssh/sshd_config
        #MaxSessions 10
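    The ssh_exchange_identification errors come from the servers accepting the scp connections, i.e. machineB and machineC (machineA is only the client here), and the commented-out #MaxStartups 10 there means the default cap of roughly 10 concurrent unauthenticated connections, which two parallel jobs of 10 copies each can exceed. A sketch of the two usual remedies; the numbers are illustrative:

        # On machineB and machineC, raise the limit in /etc/ssh/sshd_config:
        #     MaxStartups 50:30:100
        # (MaxSessions caps channels per connection, so it matters less here)
        # then reload sshd so the new value takes effect:
        sudo service sshd reload    # or: sudo /etc/init.d/sshd reload

        # Or throttle the client side instead, e.g. fewer simultaneous copies:
        # parallel --retries 10 -j 5 do_Copy {} $PRIMARY ::: "${PRIMARY_PARTITION[@]}"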

  • Server high memory usage at same time every day

    - by Sam Parmenter
    Right, we moved one of our main sites onto a new AWS box with plenty of grunt, as it allows us more control than we had before and future-proofs us. About a month ago we started running into issues with high memory usage at the same time every day. In the morning an export is run to dump data to a file, which is then FTPed to a local machine for processing. The issues were coinciding with the rough time of the export, but when we didn't run the export one day, the server still ran into the same problem. The export has since been run at other times of the day to monitor memory usage and see if it spikes. The conclusion is that the export is fine and barely touches the sides memory-wise; there is no noticeable change in memory usage. When the issue happens, its effect is to kill mysql and require us to restart the process. We think it might be a mysql memory issue, but it might just be that mysql is the first to feel it. Looking at the logs, there is no particular query run before the memory usage hits 90%. When it strikes at about 9:20am, the memory usage spikes from a near-constant 25% to 98% and very quickly kills mysql to save itself. It usually takes about 3-4 minutes to die. There are no cron jobs running at that time of the day and we haven't noticed a spike in traffic over the period of the issues. Any help would be massively appreciated! Thanks.

  • exporting clip in Final Cut Pro X or related video editing software on Mac

    - by user46976
    I'm using Final Cut Pro X to edit a video that is 1 hour long. I made individual clips from it in Final Cut Pro X and I want to save just these clips, some of which are only 5 minutes long. How can I do this? I tried using the app ClipExporter, but it won't even read my .fcpxml file; it just says that it's not a valid file and gives no helpful information at all. Another method I tried was to assign roles to each clip. I made one clip, 5 minutes long, and then used Share - Export in Final Cut Pro X and chose the option to export roles as separate files. However, the export still estimates that it will take over an hour, so it looks like it's trying to export the whole movie rather than the simple 5-minute clip, which should be exportable as a .MOV or a related format in a few minutes. How can I do this in Final Cut Pro X? I'm also happy to switch to related video editing software, as long as it is not extremely expensive. This seems like a very trivial and obvious feature: take a segment from a long movie and export just the selected region of it... I don't understand why it's so complicated to do in Final Cut Pro X. Thanks.

  • Java 1.4 singleton containing a mutable field

    - by Philippe
    Hi, I'm working on a legacy Java 1.4 project, and I have a factory that instantiates a CSV file parser as a singleton. In my CSV file parser, however, I have a HashSet that will store objects created from each line of my CSV file. All of that will be used by a web application, and users will be uploading CSV files, possibly concurrently. Now my question is: what is the best way to prevent my list of objects from being modified by 2 users at the same time? So far, I'm doing the following:
        final class MyParser {
            private File csvFile = null;
            private static Set myObjects = Collections.synchronizedSet(new HashSet());

            public synchronized void setFile(File file) {
                this.csvFile = file;
            }

            public void parse() {
                FileReader fr = null;
                try {
                    fr = new FileReader(csvFile);
                    synchronized(myObjects) {
                        myObjects.clear();
                        while(...) {
                            // for each line of my CSV, create a "MyObject"
                            myObjects.add(new MyObject(...));
                        }
                    }
                } catch (Exception e) {
                    //...
                }
            }
        }
    Should I leave the lock only on the myObjects Set, or should I declare the whole parse() method as synchronized? Also, how should I synchronize both the setting of the csvFile and the parsing? I feel like my current design is broken because threads could modify the CSV file several times while a possibly long parse process is running. I hope I'm being clear enough, because I myself am a bit confused about these multi-synchronization issues. Thanks ;-)
