Search Results

Search found 39339 results on 1574 pages for 'batch files'.


  • Share files - Ubuntu 12.04 and Windows 7 - one network - password not accepted

    - by gotqn
    I asked this question on SuperUser but nobody helped me, so I hope to get more attention here. I have three computers connected to one network through a modem, and I want to share files on this network in the easiest way possible (I have read about solutions using Samba). The three machines are: one with Windows 7, one with Windows XP, and one with Ubuntu 12.04. The situation is as follows: the Windows PCs can share files with each other; the Windows PCs can see that the Ubuntu machine is on the network; the Ubuntu PC can see only the Windows 7 PC, but when I click on a folder it asks for the network password and does not accept it (I am 100% sure it is the correct one). Is there a way to fix this, at least to enable file sharing between the Ubuntu and Windows 7 PCs, or should I choose a different approach (please advise)?
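
    When Windows 7 rejects the password given from the Ubuntu side, two things are usually worth checking: the account often has to be given as a Windows account that actually exists on the Windows 7 machine (frequently in the form WIN7PC\username), and under Control Panel > Network and Sharing Center > Advanced sharing settings, "Password protected sharing" can be turned off to allow guest access. To see the exact error instead of a generic password prompt, smbclient helps; a sketch, assuming the smbclient package is installed on the Ubuntu machine (the address, machine name and user name below are placeholders):

        # list the shares the Windows 7 machine offers, authenticating explicitly
        smbclient -L //192.168.1.2 -U Win7User

        # try one share directly, qualifying the account with the Windows machine name
        smbclient //192.168.1.2/Users -U 'WIN7PC\Win7User'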

  • Good approach for hundreds of consumers and big files

    - by ????? ???????
    I have several files (nearly 1GB each) with data. The data is one string per line. I need to process each of these files with several hundred consumers. Each consumer does some processing that differs from the others. The consumers do not write anywhere concurrently; they only need the input string, and after processing they update their local buffers. Consumers can easily be executed in parallel. Important: with one specific file, each consumer has to process all lines (without skipping) in the correct order (as they appear in the file). The order of processing different files doesn't matter. Processing a single line by one consumer is comparably fast; I expect less than 50 microseconds on a Core i5. So now I'm looking for a good approach to this problem. This is going to be part of a .NET project, so please let's stick with .NET only (C# is preferable). I know about TPL and TPL Dataflow. I guess the most relevant block would be BroadcastBlock, but I think the problem there is that with each line I'd have to wait for all consumers to finish in order to post the new one, which would not be very efficient. I think the ideal situation would be something like this: one thread reads from the file and writes to a buffer. Each consumer, when it is ready, reads a line from the buffer concurrently and processes it. An entry shouldn't be deleted from the buffer as one consumer reads it; it can be deleted only when all consumers have processed it. TPL schedules the consumer threads itself. If one consumer outperforms the others, it shouldn't wait and can read more recent entries from the buffer. Am I right with this kind of approach? Either way, how can I implement a good solution? A bit of this was already discussed on StackOverflow: link
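
    One way to get the behaviour described above (every consumer sees every line in order, each at its own pace, without unbounded memory growth) is to give each consumer its own bounded ActionBlock instead of a single BroadcastBlock. A minimal sketch, not the only possible design; the file name, capacity and consumer delegates are placeholders, and it needs the System.Threading.Tasks.Dataflow package:

        using System;
        using System.IO;
        using System.Linq;
        using System.Threading.Tasks;
        using System.Threading.Tasks.Dataflow;

        class LineFanOut
        {
            static async Task Main()
            {
                // One bounded ActionBlock per consumer: each block keeps its own queue,
                // so a fast consumer never waits for a slow one until the slow one's
                // queue is full (backpressure instead of unbounded memory use).
                var options = new ExecutionDataflowBlockOptions { BoundedCapacity = 10000 };
                Action<string>[] consumers =
                {
                    line => { /* consumer 1: update its local buffer */ },
                    line => { /* consumer 2: update its local buffer */ },
                };
                var blocks = consumers.Select(c => new ActionBlock<string>(c, options)).ToArray();

                foreach (var line in File.ReadLines("data.txt"))   // placeholder file name
                {
                    // SendAsync only waits for blocks whose queue is full, and every
                    // block receives the lines in file order.
                    foreach (var block in blocks)
                        await block.SendAsync(line);
                }

                foreach (var block in blocks)
                    block.Complete();
                await Task.WhenAll(blocks.Select(b => b.Completion));
            }
        }

    The reader only stalls when some consumer is a full buffer behind, which is the "don't delete the entry until everyone has seen it" rule expressed as per-consumer queues rather than one shared buffer.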

  • AD Stopping a Script and Writing a Value to a User's AD Account PPT Presentation

    - by Steven Maxon
    ' This will launch the PPT in a GPO
    Dim ppt
    Set ppt = CreateObject("PowerPoint.Application")
    ppt.Visible = True
    ppt.Presentations.Open "C:\Scripts\Test.pptx"

    ' This is the batch file at the end of the PPT that records the date, time, computer name and username
    echo "Logon Date:%date%,Logon Time:%time%,Computer Name:%computername%,User Name:%username%" >> \\servertest\g$\Tracking\LOGON.TXT

    This is what I need but can't find: I need the script to check a value in the Active Directory user's "Web page:" attribute that would shut off the script if the user has already completed reading the presentation. It could be as simple as writing XXXX. I need the value XXXX written to the Active Directory user's "Web page:" attribute when they finish reading the presentation, after they click on the bat file, so the script will not run again when they log in.
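
    A sketch of the missing check, assuming the "Web page" field (the wWWHomePage attribute) is used as the marker and that users are allowed to update their own personal-information attributes; the marker value XXXX follows the question, everything else is illustrative:

        On Error Resume Next
        Const MARKER = "XXXX"

        ' Bind to the logged-on user's own AD object.
        Set objSysInfo = CreateObject("ADSystemInfo")
        Set objUser = GetObject("LDAP://" & objSysInfo.UserName)

        ' Get fails if the attribute is still empty; current then stays Empty.
        current = objUser.Get("wWWHomePage")

        If current <> MARKER Then
            ' ... launch the presentation here, as in the script above ...

            ' Record completion so the script does not run on the next logon.
            objUser.Put "wWWHomePage", MARKER
            objUser.SetInfo
        End If

    In practice the If/End If check would wrap the launcher script, and the Put/SetInfo part could live in a second small script called from the same bat file that writes the LOGON.TXT line, so the marker is only written when the user actually reaches the last slide.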

  • Thumbnailers of text and ogg files

    - by David López
    I use Ubuntu 12.04 and in Nautilus I can see thumbnails of ogg files (with embedded artwork) and text files, just like in the figure. It's a nice feature. I have a slow machine on which I've installed Arch with LXDE and pcmanfm. I would like the same thumbnails, but I can only see a few of them, as in the second figure. I've installed nautilus, thunar, spacefm... and lots of different thumbnailers on my Arch machine, but I haven't been able to get thumbnails of text and ogg files. I think that maybe Ubuntu uses a patched Nautilus version with extended capabilities or something like that. Any idea? Thanks.

  • Batch View updates

    - by Terry
    I want to load a bunch of view definitions into SQL Server 2005 & 2008. I am using IF/ELSE logic to dynamically build CREATE or ALTER statements that I then EXEC. This works fine. However, unless I get the order of the statements right, I get errors if a view depends on another view that would be created in a later statement. Is there a way to turn off the validation of the SQL statements until after they have all been run? It seems like the worst of both worlds: SQL Server does late binding, so it won't propagate changes to tables and views, but you can't create a view unless all the pieces are already in place.
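
    There is no switch that defers view validation, so one workaround is a retry loop: attempt every pending definition, swallow the failures, and repeat until a whole pass makes no progress, at which point dependency order no longer matters. A sketch along those lines (the view definitions are placeholders; the pattern is a suggestion, not a built-in feature):

        CREATE TABLE #defs (id int IDENTITY(1,1) PRIMARY KEY, stmt nvarchar(max) NOT NULL, done bit NOT NULL DEFAULT 0);
        INSERT INTO #defs (stmt) VALUES (N'CREATE VIEW dbo.vOrders AS SELECT OrderId, Total FROM dbo.Orders');
        INSERT INTO #defs (stmt) VALUES (N'CREATE VIEW dbo.vBigOrders AS SELECT OrderId FROM dbo.vOrders WHERE Total > 100');

        DECLARE @id int, @stmt nvarchar(max), @progress int;
        SET @progress = 1;

        WHILE @progress > 0
        BEGIN
            SET @progress = 0;

            DECLARE defs_cur CURSOR LOCAL FAST_FORWARD FOR
                SELECT id, stmt FROM #defs WHERE done = 0 ORDER BY id;
            OPEN defs_cur;
            FETCH NEXT FROM defs_cur INTO @id, @stmt;

            WHILE @@FETCH_STATUS = 0
            BEGIN
                BEGIN TRY
                    EXEC sp_executesql @stmt;          -- runs in its own batch
                    UPDATE #defs SET done = 1 WHERE id = @id;
                    SET @progress = @progress + 1;
                END TRY
                BEGIN CATCH
                    -- most likely a dependency that does not exist yet; retry on the next pass
                    PRINT 'Deferred: ' + ERROR_MESSAGE();
                END CATCH;
                FETCH NEXT FROM defs_cur INTO @id, @stmt;
            END;

            CLOSE defs_cur;
            DEALLOCATE defs_cur;
        END;

        DROP TABLE #defs;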

  • Nautilus tags/labels/marks/columns for folders/files

    - by madox2
    Is there any way to mark folders or files with tags (or labels, new columns, or whatever) in Nautilus? It would be nice to sort marked folders or files by these tags. My first idea was to mark folders in my Movie directory with tags like seen, not seen, must see, and so on. Then I realized it would be useful in any other workspace with any custom tags... Is there any Nautilus extension for this? Or any other file manager which can do this? It might look like this:

  • rm on a directory with millions of files

    - by BMDan
    Background: physical server, about two years old, 7200-RPM SATA drives connected to a 3Ware RAID card, ext3 FS mounted noatime and data=ordered, not under crazy load, kernel 2.6.18-92.1.22.el5, uptime 545 days. The directory doesn't contain any subdirectories, just millions of small (~100 byte) files, with some larger (a few KB) ones.

    We have a server that has gone a bit cuckoo over the course of the last few months, but we only noticed it the other day when it started being unable to write to a directory because it contained too many files. Specifically, it started throwing this error in /var/log/messages:

        ext3_dx_add_entry: Directory index full!

    The disk in question has plenty of inodes remaining:

        Filesystem            Inodes    IUsed      IFree IUse% Mounted on
        /dev/sda3           60719104  3465660   57253444    6% /

    So I'm guessing that means we hit the limit of how many entries can be in the directory file itself. No idea how many files that would be, but it can't be more, as you can see, than three million or so. Not that that's good, mind you! But that's part one of my question: exactly what is that upper limit? Is it tunable? Before I get yelled at: I want to tune it down; this enormous directory caused all sorts of issues.

    Anyway, we tracked down the issue in the code that was generating all of those files, and we've corrected it. Now I'm stuck with deleting the directory. A few options here:

    1. rm -rf (dir): I tried this first. I gave up and killed it after it had run for a day and a half without any discernible impact.

    2. unlink(2) on the directory: definitely worth consideration, but the question is whether it'd be faster to delete the files inside the directory via fsck than to delete them via unlink(2). That is, one way or another, I've got to mark those inodes as unused. This assumes, of course, that I can tell fsck not to drop entries to the files in /lost+found; otherwise, I've just moved my problem. In addition to all the other concerns, after reading about this a bit more, it turns out I'd probably have to call some internal FS functions, as none of the unlink(2) variants I can find would allow me to just blithely delete a directory with entries in it. Pooh.

    3. while [ true ]; do ls -Uf | head -n 10000 | xargs rm -f 2>/dev/null; done

       This is actually the shortened version; the real one I'm running, which just adds some progress-reporting and a clean stop when we run out of files to delete, is:

        export i=0;
        time ( while [ true ]; do
          ls -Uf | head -n 3 | grep -qF '.png' || break;
          ls -Uf | head -n 10000 | xargs rm -f 2>/dev/null;
          export i=$(($i+10000));
          echo "$i...";
        done )

       This seems to be working rather well. As I write this, it's deleted 260,000 files in the past thirty minutes or so.

    Now, for the questions:

    1. As mentioned above, is the per-directory entry limit tunable?
    2. Why did it take "real 7m9.561s / user 0m0.001s / sys 0m0.001s" to delete a single file which was the first one in the list returned by ls -U, and perhaps ten minutes to delete the first 10,000 entries with the command in #3, but now it's hauling along quite happily? For that matter, it deleted 260,000 in about thirty minutes, but it has now taken another fifteen minutes to delete 60,000 more. Why the huge swings in speed?
    3. Is there a better way to do this sort of thing? Not "store millions of files in a directory"; I know that's silly, and it wouldn't have happened on my watch.

    Googling the problem and looking through SF and SO offers a lot of variations on find that obviously have the wrong idea; it's not going to be faster than my approach for several self-evident reasons. But does the delete-via-fsck idea have any legs? Or something else entirely? I'm eager to hear out-of-the-box (or inside-the-not-well-known-box) thinking. Thanks for reading the small novel; feel free to ask questions and I'll be sure to respond. I'll also update the question with the final number of files and how long the delete script ran once I have that. Final script output:

        2970000...
        2980000...
        2990000...
        3000000...
        3010000...

        real    253m59.331s
        user    0m6.061s
        sys     5m4.019s

    So, three million files deleted in a bit over four hours.
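
    For what it's worth, one out-of-the-box variant sometimes suggested for directories this large is to let rsync do the deleting, since it walks the directory itself instead of expanding a huge argument list in the shell. A sketch (the directory path is a placeholder):

        mkdir /tmp/empty
        rsync -a --delete /tmp/empty/ /path/to/huge-dir/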

  • Source control products that support linked/shared files?

    - by Ian Boyd
    We're interested in moving away from a source control system that supports the concept of shared or linked files. A shared file means: a file modified in one project is automatically updated in every other project that uses that same file. It does this without a developer having to request it, reverse-integrate it, ask for it, or even want it. We're trying to see if any other commonly used source-control systems can meet our needs and support linked or shared files. My limited research shows that: Team Foundation Server doesn't support sharing files; Subversion doesn't support sharing files (including Externals); CVS doesn't support sharing files (including Modules). Anything else (besides our current source control product, obviously)? References: "Subversion and shared files across repositories/projects?", "How to share files between CVS projects?", "Will TFS ever support shared files for projects under source control?"

  • Batch Geocoding with Garmin Mapsource

    - by Mike Trader
    Please Note: I AM NOT LOOKING FOR AN ALTERNATIVE SOLUTION. I lost track of this effort years ago but have a need to geocode thousands of addresses nightly. I must use the very accurate database sitting on the machine, installed when the Nuvi map update installed Mapsource. When I contacted Garmin years ago, they expressed an interest in providing an API for this, but then I heard nothing and did not follow up. Their database is provided by Navteq, I believe. Does anyone have experience with that format? I posted on the Garmin Developer forum a while ago, but it's a little lethargic over there :) Has anyone done this? Does anyone know how it might be done without an API, meaning the database structure and calls? I'll take a solution in any language. Added: Garmin has expressed an interest in making this available to me; they just have not done it. I do not know the database format. I am NOT looking for an online solution or any other "alternative". This question is very specific. Contact info: MikeTrader2 A T gmail D O T com. Added: I offered a 400 pt bounty for this. Jeff Atwood then offered 400 pts also. If you would like to see a solution to this, vote up the question and I will chase up Garmin and show there is interest in finally providing this. Please Note: I AM NOT LOOKING FOR AN ALTERNATIVE SOLUTION

  • How to keep "dot files" under version control?

    - by andrewsomething
    Etckeeper is a great tool for keeping track of changes to your configuration files in /etc. A few key things about it really stand out: it can be used with a wide variety of VCSs (git, mercurial, darcs, or bzr); it auto-commits daily and whenever you install, remove or upgrade a package; and it keeps track of file permissions and user/group ownership metadata. I would like to keep the "dot files" in my home directory under version control as well, preferably with bazaar. Does anyone know if a tool like etckeeper exists for this purpose? Worst case, I imagine a simple cron job running bzr add && bzr ci once or twice a day would do, along with adding ~/Documents, ~/Music, etc. to the .bzrignore. Is anyone already doing something similar with a script? While I'd prefer bazaar, other options might be interesting.
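
    The worst-case idea above is easy enough to sketch; a minimal version, assuming bzr init has already been run in the home directory (the script name and schedule are placeholders):

        #!/bin/sh
        # ~/bin/homekeeper.sh -- snapshot the home-directory branch.
        cd "$HOME" || exit 1
        bzr add -q                                                      # pick up new files not covered by .bzrignore
        bzr commit -q -m "automatic snapshot $(date '+%F %T')" || true  # exits non-zero when nothing changed

        # crontab -e entry to run it at 09:00 and 21:00:
        # 0 9,21 * * * $HOME/bin/homekeeper.sh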

  • How to Transfer Files Between Your PC and Android Phone Wirelessly

    - by Zainul Franciscus
    Mounting your Android phone to transfer files is fast and efficient, but nothing beats the convenience of a wireless file transfer. Today, we'll show you how to transfer files between Android and your computer without a USB cable.

  • idiomatic batch processing of text in emacs?

    - by Stephen
    In Python, you might do something like

        fout = open('out', 'w')
        fin = open('in')
        for line in fin:
            fout.write(process(line) + "\n")
        fin.close()
        fout.close()

    (I think it would be similar in many other languages as well). In Emacs Lisp, would you do something like

        (find-file "out")
        (setq fout (current-buffer))
        (find-file "in")
        (setq fin (current-buffer))
        (goto-char (point-min))
        (setq moreLines t)
        (while moreLines
          (setq begin (point))
          (move-end-of-line 1)
          (setq line (buffer-substring-no-properties begin (point)))
          ;; maybe
          (print (process line) fout)
          ;; or
          (save-excursion
            (set-buffer fout)
            (insert (process line)))
          (setq moreLines (= 0 (forward-line 1))))
        (kill-buffer fin)
        (kill-buffer fout)

    which I got inspiration (and code) from here. Or should I try something entirely different? And how do I stop print from wrapping the output in double quotes? Thanks!
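
    On the quoting question: print emits a read-able representation, which is where the double quotes come from, while princ writes the bare text; a buffer is a valid output stream for it, so (assuming the same fout buffer as above) the print line could instead be:

        (princ (process line) fout)
        (terpri fout)   ; add the newline, since princ does not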

  • Backup files with rsync: error 23

    - by maria
    Hi, I'm trying to make a backup of my /home to transfer all data from one computer to another. I wanted to save the backup on the same computer and then transfer it to the other one. For safety reasons, I'm trying to learn how it works on the computer without a lot of data (the new one), to be sure I won't delete something instead of copying it. I ran in a terminal:

        sudo rsync -avz /home/maria /home/guest/backup

    and the result was:

        sent 58797801 bytes  received 23050 bytes  4705668.08 bytes/sec
        total size is 100202958  speedup is 1.70
        rsync error: some files/attrs were not transferred (see previous errors) (code 23) at main.c(1060) [sender=3.0.7]

    I tried once again, with the same result. I have no idea which files were not transferred, which makes the whole backup useless for me (I wanted to do it automatically in order not to forget about something and lose it). On both computers I have the same system (Ubuntu 10.04). Rsync version: 3.0.7-1ubuntu1. Thanks for any tips.
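
    Exit code 23 only says that some files were skipped; the names are in the lines rsync printed before that summary. One way to capture them, assuming the same source and destination as above, is to log the error stream and itemize what gets transferred:

        sudo rsync -avz --itemize-changes /home/maria /home/guest/backup 2> rsync-errors.log
        cat rsync-errors.log    # typically permission-denied files or special files rsync will not copy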

  • How to FTP transfer files to /var/www?

    - by jc.yin
    I'm new to Linux and I've set up a web server with Ubuntu Desktop edition so I can practice with the GUI a bit before transitioning to Ubuntu Server. I've already set up a LAMP stack as well as FTP. Now I just need to know how to transfer my web files to the /var/www folder in Ubuntu. Previously I worked on Mac OS, where there was a central server for all the web files that I could FTP to. Can anyone help me understand how to FTP to the /var/www folder in Ubuntu? Thanks.
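
    A common route, assuming OpenSSH is (or can be) installed on the server, is to skip plain FTP and use SFTP, then give your own account write access to /var/www through the web server's group (the user name below is illustrative):

        sudo apt-get install openssh-server
        sudo adduser jc www-data                 # add your user to the web server's group
        sudo chown -R root:www-data /var/www
        sudo chmod -R g+w /var/www
        # then connect with any SFTP client (FileZilla, Cyberduck, ...) to
        # sftp://<server-ip>/var/www and upload the site files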

  • cannot rename files when saving

    - by Botond Vajna
    I am using Xubuntu 13.10. Since the last update, when I am saving files I cannot rename other files: buttons like rename, cut, copy, etc. are missing from the context menu. Let's say I edited the file named style.css in Firefox and I want to save it, but I don't want to overwrite the original; I want to rename the original to, for example, style_orig.css and save the edited file under the name style.css. To do that I must open Thunar, navigate to the file and rename it, and only after that can I save the file from Firefox. To reformulate the question: if I want to overwrite a file but preserve the original, I must make a copy of the original. I cannot do this from the application's save dialog; I must open Thunar and navigate to the file, which is an additional task and a waste of time.

  • Simplify a batch of git commands

    - by Bogdan Gusiev
    When I want to merge one branch into another, I usually do the following (in this example, master into custom):

        git checkout master && git pull && git checkout custom && git merge master

    Can somebody suggest how to simplify this? Thanks, Bogdan.
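
    One way to shorten it is a shell-function alias in ~/.gitconfig; git checkout - returns to the previously checked-out branch, so the alias works from whichever branch you want to merge into (the alias name merge-from is just a suggestion):

        # define it once:
        git config --global alias.merge-from '!f() { git checkout "$1" && git pull && git checkout - && git merge "$1"; }; f'

        # then, while on the custom branch:
        git merge-from master

    If keeping the local master branch up to date is not the point, running git fetch origin followed by git merge origin/master from the custom branch avoids switching branches at all.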

  • C# to loop through folder until it finds the correct files

    - by Liton Uddin
    I'm running a batch file to do an update to my SQL table, using Windows Task Scheduler to run it. Each day the files come in at a different time. Sometimes they come in after my scheduled time, so the batch file runs while there is no file in the folder yet and does nothing. I want to create a C# program that will loop until it finds the files in the folder and then move on to the next step in the batch file. Basically my goal is to create a program that looks for those particular files and, once it finds the correct files, starts the update. I'm new to programming and need some direction. Can someone please help me with this? Thanks.
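
    A minimal sketch of the waiting step (the folder, file pattern and poll interval are placeholders): the program simply blocks until a matching file appears, so the batch file can call it before the update step and continue when it exits.

        using System;
        using System.IO;
        using System.Threading;

        class WaitForFiles
        {
            static void Main()
            {
                const string folder = @"C:\incoming";   // placeholder drop folder
                const string pattern = "*.csv";         // placeholder file pattern

                // Poll once a minute until at least one matching file shows up.
                while (Directory.GetFiles(folder, pattern).Length == 0)
                {
                    Thread.Sleep(TimeSpan.FromMinutes(1));
                }
                // Exiting with code 0 lets the calling batch file carry on,
                // e.g. "WaitForFiles.exe && sqlcmd -i update.sql".
            }
        }

    FileSystemWatcher is the event-driven alternative, but a polling loop is easier to reason about for a run-once-per-day scheduled job.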

  • How do I export customized Libreoffice config files?

    - by carestad
    Is this possible? I want to make my own config file with my customizations that I can apply whenever I reinstall my system. For example, Ubuntu's default font color is just stupid; I want it to be BLACK and not dark grey. I also want to turn on autosave every 3rd minute and backup files. Is there a config file that I can change? The .libreoffice/* folders and XML files don't make sense to me, and they don't seem to change when I change settings in LibreOffice. Could someone please help me out with this? Thanks.
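
    The settings made under Tools > Options normally end up in a single file, registrymodifications.xcu, inside the LibreOffice user profile, so backing up that profile directory and restoring it after a reinstall should carry the customizations over. The exact path depends on the LibreOffice version:

        # older builds keep the profile here:
        cp -a ~/.libreoffice ~/libreoffice-profile-backup
        # newer builds use this path instead:
        cp -a ~/.config/libreoffice ~/libreoffice-profile-backup
        # the interesting file inside is <profile>/<version>/user/registrymodifications.xcu

    The file is generally only written out when LibreOffice exits, which may be why the XML files did not appear to change while the program was still running.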

  • skip-limit ignored for skippable exception thrown from writer

    - by ck
    I am working on a project with Spring Batch 2.0.2 and have skippable exceptions set up in the config. For exceptions thrown from the processor everything works fine: items are skipped, and once the limit is exceeded the job fails (or stops). For exceptions thrown from the writer (same chunk) it keeps skipping; the skip-limit doesn't seem to matter. Maybe I misunderstood and skipping and the writer don't go together, or maybe I am missing some configuration. Does anyone know how to skip items, with a limit, within the writer properly?

  • Okular can't read pdf files

    - by hoang anh Nguyen
    I recently installed Okular on Ubuntu 14.04. The problem is that when I open PDF files, Okular gives me the error "Can not find a plugin which is able to handle the document being passed." When I run Okular from a terminal, this is the output I get:

        okular(14100)/kdeui (KIconLoader): Error: standard icon theme "oxygen" not found!
        okular(14100)/kdeui (KIconLoader): Error: standard icon theme "oxygen" not found!
        okular(14100) KPixmapSequence::Private::loadSequence: Invalid pixmap specified.
        okular(14100) KPixmapSequence::Private::loadSequence: Invalid pixmap specified.
        okular(14100) KPixmapSequence::frameSize: No frame loaded
        okular(14100) KPixmapSequence::Private::loadSequence: Invalid pixmap specified.
        okular(14100) KPixmapSequence::frameSize: No frame loaded
        okular(14100) KPixmapSequence::Private::loadSequence: Invalid pixmap specified.
        okular(14100) KPixmapSequence::frameSize: No frame loaded
        okular(14100) KPixmapSequence::Private::loadSequence: Invalid pixmap specified.
        okular(14100) KPixmapSequence::frameSize: No frame loaded
        okular(14100) KPixmapSequence::Private::loadSequence: Invalid pixmap specified.
        okular(14100) KPixmapSequence::frameSize: No frame loaded
        okular(14100) KPixmapSequence::Private::loadSequence: Invalid pixmap specified.
        okular(14100) KPixmapSequence::frameSize: No frame loaded
        okular(14100): No ksycoca4 database available!
        okular(14100)/kdecore (trader) KServiceTypeTrader::defaultOffers: KServiceTypeTrader: serviceType "okular/Generator" not found
        okular(14100)/kdecore (KConfigSkeleton) KCoreConfigSkeleton::writeConfig:
        okular(14100)/kdecore (KConfigSkeleton) KCoreConfigSkeleton::writeConfig:
        okular(14100)/kdecore (KConfigSkeleton) KCoreConfigSkeleton::writeConfig:
        okular(14100)/kdecore (KConfigSkeleton) KCoreConfigSkeleton::writeConfig:
        okular(14100)/kdecore (KConfigSkeleton) KCoreConfigSkeleton::writeConfig:
        okular(14100): No ksycoca4 database available!
        okular(14100)/kdecore (trader) mimeTypeSycocaServiceOffers: KMimeTypeTrader: mimeType "application/pdf" not found
        okular(14100): No ksycoca4 database available!
        okular(14100)/kdecore (trader): KMimeTypeTrader: couldn't find service type "okular/Generator" Please ensure that the .desktop file for it is installed; then run kbuildsycoca4.
        okular(14100)/okular (app) Okular::Document::openDocument: No plugin for mimetype '"application/pdf"'.
        okular(14100): Couldn't start knotify from knotify4.desktop: "KLauncher could not be reached via D-Bus. Error when calling start_service_by_desktop_path: The name org.kde.klauncher was not provided by any .service files "
        okular(14100)/kdeui (KNotification) KNotification::slotReceivedIdError: Error while contacting notify daemon "The name org.kde.knotify was not provided by any .service files"
        X Error: BadWindow (invalid Window parameter) 3
          Major opcode: 20 (X_GetProperty)
          Resource id:  0x2a0002e
        okular(14110) KPixmapSequence::Private::loadSequence: Invalid pixmap specified.
        okular(14110) KPixmapSequence::frameSize: No frame loaded
        okular(14110) KPixmapSequence::Private::loadSequence: Invalid pixmap specified.
        okular(14110) KPixmapSequence::frameSize: No frame loaded
        okular(14110) KPixmapSequence::Private::loadSequence: Invalid pixmap specified.
        okular(14110) KPixmapSequence::frameSize: No frame loaded
        X Error: BadWindow (invalid Window parameter) 3
          Major opcode: 20 (X_GetProperty)
          Resource id:  0x2a0001d
        X Error: BadWindow (invalid Window parameter) 3
          Major opcode: 20 (X_GetProperty)
          Resource id:  0x2a0001d

    I would much appreciate any suggestion for solving this problem.
Thanks a lot :)
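
    The repeated "No ksycoca4 database available" lines say that KDE's service cache, which is how Okular locates its PDF backend, has never been built, and the log itself suggests the fix. Something along these lines usually restores the PDF plugin (the icon-theme package name is the standard Ubuntu one):

        kbuildsycoca4 --noincremental             # rebuild the KDE 4 service cache as your own user
        sudo apt-get install oxygen-icon-theme    # quiets the "oxygen not found" warnings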

  • How to Stream Media Files From any PC to Your PlayStation 3

    - by Zainul Franciscus
    Have you ever wished that you could stream video files from your computer over to your TV without actually hooking the two directly together? If you've got a PlayStation 3, you're in luck, because that's today's geek lesson. If you're wondering how to rip DVDs to your PC, we've got you covered with an article on the subject, but you can stream video files that you've recorded yourself, or downloaded from somewhere.
