Search Results

Search found 15977 results on 640 pages for 'home dir'.


  • Pass a string between two threads in Java

    - by geeta
    I have to search for a string in a file and write the matching lines to another file. I have one thread to read the file and one thread to write. I want to send the StringBuffer from the read thread to the write thread, but I am getting a null value on the receiving side. Please help me pass it correctly.

    Write thread:

        class OutputThread extends Thread {
            /****************** Writes the line with search string to the output file *************/
            Thread runner1, runner;
            File Out_File;

            public OutputThread() {
            }

            public OutputThread(Thread runner, File Out_File) {
                runner1 = new Thread(this, "writeThread"); // (1) Create a new thread.
                this.Out_File = Out_File;
                this.runner = runner;
                runner1.start(); // (2) Start the thread.
            }

            public void run() {
                try {
                    BufferedWriter bufferedWriter = new BufferedWriter(new FileWriter(Out_File, true));
                    System.out.println("inside write");
                    synchronized (runner) {
                        System.out.println("inside wait");
                        runner.wait();
                    }
                    System.out.println("outside wait");
                    // bufferedWriter.write(line.toString());
                    Buffer Buf = new Buffer();
                    bufferedWriter.write(Buf.buffers);
                    System.out.println(Buf.buffers);
                    bufferedWriter.flush();
                } catch (Exception e) {
                    System.out.println(e);
                    e.printStackTrace();
                }
            }
        }

    Read thread:

        class FileThread extends Thread {
            Thread runner;
            File dir;
            String search_string, stats;
            File Out_File, final_output;
            StringBuffer sb = new StringBuffer();

            public FileThread() {
            }

            public FileThread(CountDownLatch latch, String threadName, File dir, String search_string,
                              File Out_File, File final_output, String stats) {
                runner = new Thread(this, threadName); // (1) Create a new thread.
                this.dir = dir;
                this.search_string = search_string;
                this.Out_File = Out_File;
                this.stats = stats;
                this.final_output = final_output;
                this.latch = latch;
                runner.start(); // (2) Start the thread.
            }

            public void run() {
                try {
                    Enumeration entries;
                    ZipFile zipFile;
                    String source_file_name = dir.toString();
                    File Source_file = dir;
                    String extension;
                    OutputThread out = new OutputThread(runner, Out_File);
                    int dotPos = source_file_name.lastIndexOf(".");
                    extension = source_file_name.substring(dotPos + 1);
                    if (extension.equals("zip")) {
                        zipFile = new ZipFile(source_file_name);
                        entries = zipFile.entries();
                        while (entries.hasMoreElements()) {
                            ZipEntry entry = (ZipEntry) entries.nextElement();
                            if (entry.isDirectory()) {
                                (new File(entry.getName())).mkdir();
                                continue;
                            }
                            searchString(runner, entry.getName(),
                                         new BufferedInputStream(zipFile.getInputStream(entry)),
                                         Out_File, final_output, search_string, stats);
                        }
                        zipFile.close();
                    } else {
                        searchString(runner, Source_file.toString(),
                                     new BufferedInputStream(new FileInputStream(Source_file)),
                                     Out_File, final_output, search_string, stats);
                    }
                } catch (Exception e) {
                    System.out.println(e);
                    e.printStackTrace();
                }
            }

            /********* Reads the Input Files and Searches for the String ******************************/
            public void searchString(Thread runner, String Source_File, BufferedInputStream in, File output_file,
                                     File final_output, String search, String stats) {
                int count = 0;
                int countw = 0;
                int countl = 0;
                String s;
                String[] str;
                String newLine = System.getProperty("line.separator");
                try {
                    BufferedReader br2 = new BufferedReader(new InputStreamReader(in));
                    // OutputFile outfile = new OutputFile();
                    BufferedWriter bufferedWriter = new BufferedWriter(new FileWriter(output_file, true));
                    Buffer Buf = new Buffer();
                    // StringBuffer sb = new StringBuffer();
                    StringBuffer sb1 = new StringBuffer();
                    while ((s = br2.readLine()) != null) {
                        str = s.split(search);
                        count = str.length - 1;
                        countw += count;
                        if (s.contains(search)) {
                            countl++;
                            sb.append(s);
                            sb.append(newLine);
                        }
                        if (countl % 100 == 0) {
                            System.out.println("inside count");
                            Buf.setBuffers(sb.toString());
                            sb.delete(0, sb.length());
                            System.out.println("outside notify");
                            synchronized (runner) {
                                runner.notify();
                            }
                            // outfile.WriteFile(sb, bufferedWriter);
                            // sb.delete(0, sb.length());
                        }
                    }
                    synchronized (runner) {
                        runner.notify();
                    }
                    br2.close();
                    in.close();
                    if (countw == 0) {
                        System.out.println("Input File : " + Source_File);
                        System.out.println("Word not found");
                        System.exit(0);
                    } else {
                        System.out.println("Input File : " + Source_File);
                        System.out.println("Matched word count : " + countw);
                        System.out.println("Lines with Search String : " + countl);
                        System.out.println("Output File : " + output_file.toString());
                        System.out.println();
                    }
                } catch (Exception e) {
                    System.out.println(e);
                    e.printStackTrace();
                }
            }
        }

  • Linux: modpost does not build anything

    - by waffleman
    I am having problems getting any kernel modules to build on my machine. Whenever I build a module, modpost always says there are zero modules:

        MODPOST 0 modules

    To troubleshoot the problem, I wrote a test module (hello.c):

        #include <linux/module.h>   /* Needed by all modules */
        #include <linux/kernel.h>   /* Needed for KERN_INFO */
        #include <linux/init.h>     /* Needed for the macros */

        static int __init hello_start(void)
        {
            printk(KERN_INFO "Loading hello module...\n");
            printk(KERN_INFO "Hello world\n");
            return 0;
        }

        static void __exit hello_end(void)
        {
            printk(KERN_INFO "Goodbye Mr.\n");
        }

        module_init(hello_start);
        module_exit(hello_end);

    Here is the Makefile for the module:

        obj-m = hello.o
        KVERSION = $(shell uname -r)

        all:
            make -C /lib/modules/$(KVERSION)/build M=$(shell pwd) modules
        clean:
            make -C /lib/modules/$(KVERSION)/build M=$(shell pwd) clean

    When I build it on my machine, I get the following output:

        make -C /lib/modules/2.6.32-27-generic/build M=/home/waffleman/tmp/mod-test modules
        make[1]: Entering directory `/usr/src/linux-headers-2.6.32-27-generic'
          CC [M]  /home/waffleman/tmp/mod-test/hello.o
          Building modules, stage 2.
          MODPOST 0 modules
        make[1]: Leaving directory `/usr/src/linux-headers-2.6.32-27-generic'

    When I make the module on another machine, it is successful:

        make -C /lib/modules/2.6.24-27-generic/build M=/home/somedude/tmp/mod-test modules
        make[1]: Entering directory `/usr/src/linux-headers-2.6.24-27-generic'
          CC [M]  /home/somedude/tmp/mod-test/hello.o
          Building modules, stage 2.
          MODPOST 1 modules
          CC      /home/somedude/tmp/mod-test/hello.mod.o
          LD [M]  /home/somedude/tmp/mod-test/hello.ko
        make[1]: Leaving directory `/usr/src/linux-headers-2.6.24-27-generic'

    I looked for any relevant documentation about modpost but found little. Does anyone know how modpost decides what to build? Is there an environment variable I am possibly missing? BTW, here is what I am running:

        uname -a
        Linux waffleman-desktop 2.6.32-27-generic #49-Ubuntu SMP Wed Dec 1 23:52:12 UTC 2010 i686 GNU/Linux
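
    For what it's worth, modpost only links the modules it finds recorded in the external build's .tmp_versions directory (one .mod file per module), so an empty or stale .tmp_versions produces exactly "MODPOST 0 modules". A few hedged diagnostics, not a confirmed fix:

        make clean            # clears objects and the .tmp_versions bookkeeping
        make V=1              # verbose build: shows the exact modpost invocation
        ls .tmp_versions/     # after 'make', hello.mod should be listed here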

  • DOS batch script to list folders that have a specific file in them

    - by Lee
    I'm trying to create a file that lists the directories containing a specific file name. Let's say I'm trying to find directories that have a file named *.joe in them. I initially tried just a simple

        dir /ad *.joe > dir_list.txt

    but it searches the directory names for *.joe, so no go. Then I concluded that a for loop was probably my best bet. I started with

        for /d /r %a in ('dir *.joe /b') do @echo %a >> dir_list.txt

    and it looked like it wasn't executing the dir command. I added "usebackq", but that seems to work only for the /F command extension. Ideas?
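
    One hedged approach: dir /s /b prints the full path of each matching file, and the ~dp modifier reduces a path to its drive and directory. A sketch written for interactive use (inside a .bat file the %a would be doubled to %%a); the sort pass just groups duplicates when several .joe files share a directory:

        for /f "delims=" %a in ('dir /s /b *.joe') do @echo %~dpa>> dir_list_raw.txt
        sort dir_list_raw.txt > dir_list.txt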

  • grails and flash movie

    - by ziftech
    Is it possible to insert a simple Flash movie into a GSP? I tried this way:

        <object type="application/x-shockwave-flash"
                data="${resource(dir: 'flash', file: 'movie.swf')}"
                width="400" height="400">
          <param name="movie" value="${resource(dir: 'flash', file: 'movie.swf')}" />
          <param name="bgcolor" value="#ffffff" />
          <param name="AllowScriptAccess" value="always" />
          <param name="flashvars" value="feed=${resource(dir: 'flash', file: 'movie.xml')}" />
          <p>This widget requires Flash Player 9 or better</p>
        </object>

    It seems that the movie is loaded, but the .xml and pictures are not...
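
    One hedged explanation: the Flash player resolves relative URLs in flashvars against the location of the .swf itself, not the page, so movie.xml (and the images it references) may be requested from the wrong path. Passing absolute URLs is the usual remedy; this sketch assumes your Grails version's resource() accepts the absolute flag (createLink(uri: '/flash/movie.xml', absolute: true) is an alternative):

        <param name="flashvars"
               value="feed=${resource(dir: 'flash', file: 'movie.xml', absolute: true)}" />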

  • Making an ANT Macro more reusable

    - by 1ndivisible
    I have a simple macro (simplified version below). At the moment it assumes there will be a single value for a single argument, but there might be multiple values. How can I make the macro accept zero or more values for that argument, not just a single one?

        <macrodef name="test">
          <attribute name="target.dir" />
          <attribute name="arg.value" />
          <sequential>
            <java jar="${some.jar}" dir="@{target.dir}" fork="true" failonerror="true">
              <arg value="-someargname=@{arg.value}"/>
            </java>
          </sequential>
        </macrodef>
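
    Macrodef attributes are single-valued, but a nested <element> accepts any number of children, which gives the zero-or-more behaviour. A hedged rework (the element name and call-site values are illustrative):

        <macrodef name="test">
          <attribute name="target.dir" />
          <element name="extra-args" optional="true" />
          <sequential>
            <java jar="${some.jar}" dir="@{target.dir}" fork="true" failonerror="true">
              <extra-args/>
            </java>
          </sequential>
        </macrodef>

        <!-- Call site: pass zero or more <arg> children. -->
        <test target.dir="build">
          <extra-args>
            <arg value="-someargname=foo"/>
            <arg value="-someargname=bar"/>
          </extra-args>
        </test>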

  • Excluding some classes from the cobertura report doesn't work

    - by user357480
    I tried to exclude some classes from cobertura as specified in this site:

        <cobertura-instrument todir="${voldemort.instrumented.dir}"
                              datafile="${cobertura.instrument.file}">
          <classpath refid="tools-classpath" />
          <ignore regex=".*\.xsd" />
          <fileset dir="${voldemort.dist.dir}/classes">
            <include name="**/*.class" />
            <exclude name="**/client/protocol/pb/*.class"/>
            <exclude name="**/server/http/*.class"/>
          </fileset>
        </cobertura-instrument>

    But that doesn't work. Help me out, please.
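
    One hedged thing to check, not a confirmed fix: cobertura appends to its datafile rather than rebuilding it, so classes instrumented by an earlier, unfiltered run can linger in the report even after the excludes are added. Deleting the datafile before re-instrumenting rules that out:

        <delete file="${cobertura.instrument.file}" quiet="true"/>
        <cobertura-instrument todir="${voldemort.instrumented.dir}"
                              datafile="${cobertura.instrument.file}">
          <!-- fileset as above -->
        </cobertura-instrument>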

  • Create a new output file in ant replace task

    - by Sathish
    The Ant replace task does an in-place replacement without creating a new file. The snippet below replaces tokens in any of the *.xml files with the corresponding values from my.properties:

        <replace dir="${projects.prj.dir}/config"
                 replacefilterfile="${projects.prj.dir}/my.properties"
                 includes="*.xml"
                 summary="true"/>

    I want the *.xml files that had their tokens replaced to be created as *.xml.filtered (for example) while keeping the original *.xml. Is this possible in Ant with some smart combination of tasks and concepts?
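
    <replace> is strictly in-place, but <copy> with a <filterset> and a <globmapper> writes filtered copies next to the originals, which is exactly the *.xml to *.xml.filtered shape. A hedged sketch, assuming the tokens use the @token@ delimiters that <filtersfile> expects:

        <copy todir="${projects.prj.dir}/config">
          <fileset dir="${projects.prj.dir}/config" includes="*.xml"/>
          <globmapper from="*.xml" to="*.xml.filtered"/>
          <filterset>
            <filtersfile file="${projects.prj.dir}/my.properties"/>
          </filterset>
        </copy>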

  • Why doesn't Perl threading work when I call readdir beforehand?

    - by Kevin
    Whenever I call readdir before I create a thread, I get an error that looks like this:

        perl(2820,0x7fff70c33ca0) malloc: *** error for object 0x10082e600: pointer being freed was not allocated
        *** set a breakpoint in malloc_error_break to debug
        Abort trap

    What's strange is that it happens when I call readdir before I create a thread (i.e. readdir is not called in any concurrent code). I don't even use the results from readdir; just making the call seems to screw things up. When I get rid of it, things work fine. Some example code is below:

        opendir(DIR, $someDir);
        my @allFiles = readdir(DIR);
        close(DIR);

        my $thread = threads->create(\&sub1);
        $thread->join();

        sub sub1 { print "in thread\n" }

  • Heroku only initializes some of my models.

    - by JayX
    So I ran heroku db:push and it returned:

        Sending schema
        Schema:        100% |==========================================| Time: 00:00:08
        Sending indexes
        schema_migrat: 100% |==========================================| Time: 00:00:00
        projects:      100% |==========================================| Time: 00:00:00
        tasks:         100% |==========================================| Time: 00:00:00
        users:         100% |==========================================| Time: 00:00:00
        Sending data
        8 tables, 70,551 records
        groups:        100% |==========================================| Time: 00:00:00
        schema_migrat: 100% |==========================================| Time: 00:00:00
        projects:      100% |==========================================| Time: 00:00:00
        tasks:         100% |==========================================| Time: 00:00:02
        authenticatio: 100% |==========================================| Time: 00:00:00
        articles:      100% |==========================================| Time: 00:08:27
        users:         100% |==========================================| Time: 00:00:00
        topics:        100% |==========================================| Time: 00:01:22
        Resetting sequences

    When I went to the heroku console, these worked:

        >> Task
        => Task(id: integer, topic: string, content: string,
        >> User
        => User(id: integer, name: string, email: string,

    But the rest only returned something like:

        >> Project
        NameError: uninitialized constant Project
            /home/heroku_rack/lib/console.rb:150
            /home/heroku_rack/lib/console.rb:150:in `call'
            /home/heroku_rack/lib/console.rb:28:in `call'
        >> Authentication
        NameError: uninitialized constant Authentication
            /home/heroku_rack/lib/console.rb:150
            /home/heroku_rack/lib/console.rb:150:in `call'

    Update 1: When I typed

        >> ActiveRecord::Base.connection.tables

    it returned

        => ["projects", "groups", "tasks", "topics", "articles", "schema_migrations", "authentications", "users"]

    Using heroku's SQL console plugin I got:

        SQL> show tables
        +-------------------+
        | table_name        |
        +-------------------+
        | authentications   |
        | topics            |
        | groups            |
        | projects          |
        | schema_migrations |
        | tasks             |
        | articles          |
        | users             |
        +-------------------+

    So I think they already exist in heroku's database. There is probably something wrong with rake db:migrate.

    Update 2: I ran rake db:migrate locally in both production and development modes and nothing went wrong. But when I ran it on heroku it only returned:

        $ heroku rake db:migrate
        (in /disk1/home/slugs/389817_1c16250_4bf2-f9c9517b-bdbd-49d9-8e5a-a87111d3558e/mnt)
        $

    Also, I am using sqlite3.

    Update 3: So I opened up the heroku console and typed in the following command:

        class Authentication < ActiveRecord::Base; end

    Amazingly, I was then able to call the Authentication class, but once I exited, nothing was changed.
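
    Given that the tables exist but the constants don't, one commonly seen cause (hedged, not confirmed from the output above) is that the model files themselves never made it into the slug: heroku db:push moves data, while code only arrives via git. A quick check, assuming a standard Rails layout:

        git status                          # are app/models/project.rb etc. untracked?
        git add app/models
        git commit -m "add missing models"
        git push heroku master
        heroku restart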

  • Android: WindowManager$BadTokenException on Spinner click

    - by Miya
    Hi, I have a spinner in my home.class. When I click on the spinner, the process stops with a WindowManager$BadTokenException. I am calling home.class from main.class, which extends ActivityGroup. If I simply run home.class on its own, the spinner shows all its items; the problem only appears when calling home.class from main.class. The code follows. Please tell me why this happens.

    main.class:

        public class main extends ActivityGroup {
            public void onCreate(Bundle savedInstanceState) {
                super.onCreate(savedInstanceState);
                Intent intent = new Intent(this, home.class);
                View view = getLocalActivityManager()
                        .startActivity("1", intent.addFlags(Intent.FLAG_ACTIVITY_CLEAR_TOP))
                        .getDecorView();
                setContentView(view);
            }
        }

    home.class:

        String[] country = {"Please selects", "US", "INDIA", "UK"};
        Spinner s2 = (Spinner) findViewById(R.id.spinnerCountry);
        ArrayAdapter<CharSequence> adapterCountry =
                new ArrayAdapter(this, android.R.layout.simple_spinner_item, country);
        adapterCountry.setDropDownViewResource(android.R.layout.simple_spinner_dropdown_item);
        s2.setAdapter(adapterCountry);
        s2.setOnItemSelectedListener(new OnItemSelectedListener() {
            public void onItemSelected(AdapterView<?> parent, View view, int position, long id) {
                countryName = country[position];
            }

            public void onNothingSelected(AdapterView<?> parent) {
                countryName = country[0];
            }
        });

    Stack:

        Thread [<1> main] (Suspended (exception WindowManager$BadTokenException))
            AlertDialog(Dialog).show() line: 245
            AlertDialog$Builder.show() line: 802
            Spinner.performClick() line: 260
            View$PerformClick.run() line: 9080
            ViewRoot(Handler).handleCallback(Message) line: 587
            ViewRoot(Handler).dispatchMessage(Message) line: 92
            Looper.loop() line: 123
            ActivityThread.main(String[]) line: 3647
            Method.invokeNative(Object, Object[], Class, Class[], Class, int, boolean) line: not available [native method]
            Method.invoke(Object, Object...) line: 507
            ZygoteInit$MethodAndArgsCaller.run() line: 839
            ZygoteInit.main(String[]) line: 597
            NativeStart.main(String[]) line: not available [native method]

    Thank You....
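
    The usual reading of this trace: the Spinner's dropdown is an AlertDialog, and a dialog needs a Context whose window token is valid at the top level; inside an ActivityGroup, the embedded child activity's token is not. The commonly cited workaround (hedged; getParent() is standard Activity API, but whether it cures your exact setup is not guaranteed) is to build anything dialog-producing against the hosting group's context:

        // Hypothetical sketch inside the embedded child activity.
        Context top = (getParent() != null) ? getParent() : this;
        // For dialogs you create yourself:
        AlertDialog.Builder builder = new AlertDialog.Builder(top);
        // For the Spinner's own dropdown, the variant usually suggested is to
        // inflate the child's layout with 'top', e.g. via LayoutInflater.from(top).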

  • How do I use PDO's prepared statements for ORDER BY and LIMIT clauses? (Or can I? If not, what should I resort to?)

    - by user198729
    $sql = "SELECT * FROM table ORDER BY :sort :dir LIMIT :start, :results"; $stmt = $dbh->prepare($sql); $stmt->execute(array( 'sort' => $_GET['sort'], 'dir' => $_GET['dir'], 'start' => $_GET['start'], 'results' => $_GET['results'], ) ); I tried to use prepare to do the job,but $stmt->fetchAll(PDO::FETCH_ASSOC); returns nothing. Can someone point out what's the wrong thing I am doing?

  • Create Directory, 'cd' to it and download a file pipeline in Perl

    - by neversaint
    I have a file that looks like this:

        ftp://url1/files1.tar.gz    dir1
        ftp://url2/files2.txt       dir2
        ... many more ...

    What I want to do are these steps:

    1. Create a directory based on column 2
    2. 'cd' to that directory
    3. Download the file with 'wget' based on column 1

    But why doesn't this approach of mine work?

        while (<>) {
            chomp;
            my ($url, $dir) = split(/\t/, $_);
            system("mkdir $dir");
            system("cd $dir");   # Fails here
            system("wget $url"); # here too
        }

    What's the right way to do it?
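
    Each system() call spawns its own child shell, so the cd changes the child's working directory and then the child exits; the parent Perl process (and the next system call) never moves. A hedged rework that changes directory in Perl itself:

        use strict;
        use warnings;

        while (<>) {
            chomp;
            my ($url, $dir) = split /\t/;
            mkdir $dir or die "mkdir $dir: $!" unless -d $dir;
            chdir $dir or die "chdir $dir: $!";
            system("wget", $url) == 0 or warn "wget $url failed: $?";
            chdir ".." or die "chdir ..: $!";
        }

    Alternatively, wget's -P flag downloads straight into a target directory, which removes the need to change directory at all: system("wget", "-P", $dir, $url).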

  • Configure main domain and subdomain with different resources in Apache server

    - by Ritesh Patadiya
    I have an issue creating a subdomain in the Apache server. Normally we can do it the following way:

        <VirtualHost *:80>
            ServerName www.maindomain.com
            ServerAlias *.maindomain.com
            DocumentRoot "/home/abc/xyz"
            <Directory "/home/abc/xyz">
                AllowOverride All
                Allow from All
            </Directory>
        </VirtualHost>

    In the above example both the main domain and the subdomains share the same directory. But my requirement is that the main domain has its own resources and the subdomains use another. I want to do something like this:

        <VirtualHost *:80>
            ServerName www.maindomain.com
            DocumentRoot "/home/abc/xyz"
            <Directory "/home/abc/xyz">
                AllowOverride All
                Allow from All
            </Directory>
        </VirtualHost>

        <VirtualHost *:80>
            ServerName xyz.maindomain.com
            ServerAlias *.maindomain.com
            DocumentRoot "/home/ghi/pqr"
            <Directory "/home/ghi/pqr">
                AllowOverride All
                Allow from All
            </Directory>
        </VirtualHost>

    This didn't work for me.
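
    One hedged suspect, assuming an Apache 2.2-era config: name-based matching on a port only happens if NameVirtualHost is declared for it, and without it the first <VirtualHost *:80> silently answers every request, which matches the symptom. A sketch (directives are standard Apache 2.2; paths are from the question):

        NameVirtualHost *:80

        <VirtualHost *:80>
            ServerName www.maindomain.com
            DocumentRoot "/home/abc/xyz"
        </VirtualHost>

        # Declared after the main host, so the exact ServerName above wins first
        # and everything else falls through to the wildcard alias.
        <VirtualHost *:80>
            ServerName xyz.maindomain.com
            ServerAlias *.maindomain.com
            DocumentRoot "/home/ghi/pqr"
        </VirtualHost>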

  • Shell script for generating HTML output

    - by user1638016
    The following script generates the desired output but does not redirect the result to /home/myuser/slavedelay.html:

        #!/bin/bash
        host=<ip>
        echo $host
        user=usr1
        password=mypass
        threshold=300
        statusok=OK
        statuscritical=CRITICAL
        for i in ert7 ert9
        do
            echo "<html>" > /home/myuser/slavedelay.html
            if [ "$i" == "ert7" ]; then
                slvdelay=`mysql -u$user -p$password -h<ip> -S /backup/mysql/mysql.sock -e 'show slave status\G' | grep Seconds_Behind_Master | sed -e 's/ *Seconds_Behind_Master: //'`
                if [ $slvdelay -ge $threshold ]; then
                    echo "<tr><td>$i</td><td>CRITICAL</td>" >> /home/myuser/slavedelay.html
                    echo "<tr><td>$i</td><td>CRITICAL</td>"
                else
                    echo "<tr><td>$i</td><td>OK</td>" >> /home/myuser/slavedelay.html
                    echo "<tr><td>$i</td><td>OK</td>"
                fi
            fi
        done
        echo "</html>" >> /home/myuser/slavedelay.html

    If I cat the output file /home/myuser/slavedelay.html, it contains only:

        <html>
        </html>

    Execution result:

        sh slave_delay.sh
        <tr><td>sdb7</td><td>OK</td>
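
    The culprit is the echo "<html>" > ... inside the for loop: > truncates the file on every iteration, so the second pass (ert9, which writes nothing) wipes the row appended for ert7, leaving only the opening and closing tags. A hedged fix: truncate once before the loop and only append inside it (this sketch also closes the <tr> tags):

        #!/bin/bash
        out=/home/myuser/slavedelay.html
        echo "<html>" > "$out"              # truncate once, before the loop
        for i in ert7 ert9; do
            # ... compute $slvdelay for $i as above ...
            if [ "$slvdelay" -ge "$threshold" ]; then
                echo "<tr><td>$i</td><td>CRITICAL</td></tr>" >> "$out"
            else
                echo "<tr><td>$i</td><td>OK</td></tr>" >> "$out"
            fi
        done
        echo "</html>" >> "$out"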

  • USB device Set Attribute in C#

    - by p19lord
    I have this bit of code:

        DriveInfo[] myDrives = DriveInfo.GetDrives();
        foreach (DriveInfo myDrive in myDrives)
        {
            if (myDrive.DriveType == DriveType.Removable)
            {
                string path = Convert.ToString(myDrive.RootDirectory);
                DirectoryInfo mydir = new DirectoryInfo(path);
                String[] dirs = new string[] { Convert.ToString(mydir.GetDirectories()) };
                String[] files = new string[] { Convert.ToString(mydir.GetFiles()) };
                foreach (var file in files)
                {
                    File.SetAttributes(file, ~FileAttributes.Hidden);
                    File.SetAttributes(file, ~FileAttributes.ReadOnly);
                }
                foreach (var dir in dirs)
                {
                    File.SetAttributes(dir, ~FileAttributes.Hidden);
                    File.SetAttributes(dir, ~FileAttributes.ReadOnly);
                }
            }
        }

    The problem is that it tries the code on the floppy disk drive first, and because there is no floppy disk in it, it throws "The device is not ready." How can I prevent that?
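
    DriveType.Removable covers floppy drives too, so the loop has to skip drives that report no media via IsReady. Two further hedged observations: Convert.ToString() on a DirectoryInfo[] or FileInfo[] yields the array's type name rather than the entries, and assigning ~FileAttributes.Hidden directly sets every other attribute bit instead of clearing one. A sketch addressing all three:

        using System;
        using System.IO;

        class Program
        {
            static void Main()
            {
                foreach (DriveInfo drive in DriveInfo.GetDrives())
                {
                    // Skip non-removable drives and drives with no media inserted.
                    if (drive.DriveType != DriveType.Removable || !drive.IsReady)
                        continue;

                    DirectoryInfo root = drive.RootDirectory;
                    foreach (FileInfo file in root.GetFiles())
                        File.SetAttributes(file.FullName,
                            File.GetAttributes(file.FullName)
                                & ~(FileAttributes.Hidden | FileAttributes.ReadOnly));

                    foreach (DirectoryInfo dir in root.GetDirectories())
                        File.SetAttributes(dir.FullName,
                            File.GetAttributes(dir.FullName)
                                & ~(FileAttributes.Hidden | FileAttributes.ReadOnly));
                }
            }
        }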

  • Getting a `free()` error when deallocating with `delete` in the backtrace

    - by wonko
    I got the following error from gdb:

        *** glibc detected *** /.root0/autohome/u132/hsreekum/ipopt/ipopt/debug/Ipopt/examples/ex3/ex3: free(): invalid next size (fast): 0x0000000120052b60 ***

    Here's the backtrace:

        #0  0x000000555626b264 in raise () from /lib/libc.so.6
        #1  0x000000555626cc6c in abort () from /lib/libc.so.6
        #2  0x00000055562a7b9c in __libc_message () from /lib/libc.so.6
        #3  0x00000055562aeabc in malloc_printerr () from /lib/libc.so.6
        #4  0x00000055562b036c in free () from /lib/libc.so.6
        #5  0x000000555561ddd0 in Ipopt::TNLPAdapter::~TNLPAdapter () from /home/ba01/u132/hsreekum/ipopt/ipopt/build/lib/libipopt.so.1
        #6  0x00000055556a9910 in Ipopt::GradientScaling::~GradientScaling () from /home/ba01/u132/hsreekum/ipopt/ipopt/build/lib/libipopt.so.1
        #7  0x00000055557241b8 in Ipopt::OrigIpoptNLP::~OrigIpoptNLP () from /home/ba01/u132/hsreekum/ipopt/ipopt/build/lib/libipopt.so.1
        #8  0x00000055556ae7f0 in Ipopt::IpoptAlgorithm::~IpoptAlgorithm () from /home/ba01/u132/hsreekum/ipopt/ipopt/build/lib/libipopt.so.1
        #9  0x0000005555602278 in Ipopt::IpoptApplication::~IpoptApplication () from /home/ba01/u132/hsreekum/ipopt/ipopt/build/lib/libipopt.so.1
        #10 0x0000005555614428 in FreeIpoptProblem () from /home/ba01/u132/hsreekum/ipopt/ipopt/build/lib/libipopt.so.1
        #11 0x0000000120001610 in main () at ex3.c:169

    And here's the code for Ipopt::TNLPAdapter::~TNLPAdapter():

        TNLPAdapter::~TNLPAdapter()
        {
            delete [] full_x_;
            delete [] full_lambda_;
            delete [] full_g_;
            delete [] jac_g_;
            delete [] c_rhs_;
            delete [] jac_idx_map_;
            delete [] h_idx_map_;
            delete [] x_fixed_map_;
            delete [] findiff_jac_ia_;
            delete [] findiff_jac_ja_;
            delete [] findiff_jac_postriplet_;
            delete [] findiff_x_l_;
            delete [] findiff_x_u_;
        }

    My question is: why does free() throw an error when ~TNLPAdapter() uses delete[]? Also, I would like to step through ~TNLPAdapter() so I can see which deallocation causes the error. I believe the error occurs in the external library (IPOPT), but I have compiled it with debug flags on; is this sufficient?
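
    The apparent mismatch is only cosmetic: on glibc, operator delete[] ultimately calls free(), so heap corruption introduced anywhere earlier (typically an overrun of one of those arrays) surfaces at whichever deallocation runs next; the delete[] that aborts is usually the victim, not the culprit. Valgrind tends to point at the original overrun directly; a hedged invocation, assuming the debug build retains symbols:

        valgrind --tool=memcheck --track-origins=yes ./ex3

    To step through the destructor in gdb: break Ipopt::TNLPAdapter::~TNLPAdapter, then next past each delete[] until the abort fires.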

  • NoClassDefFoundError

    - by John Smith
    I have the following piece of code, which helps me read the paths of all the files in a directory and its subdirectories:

        File dir = new File("directory path");
        try {
            System.out.println("Getting all files in " + dir.getCanonicalPath()
                    + " including those in subdirectories");
        } catch (IOException e1) {
            // TODO Auto-generated catch block
            e1.printStackTrace();
        }
        List<File> files = (List<File>) FileUtils.listFiles(dir,
                TrueFileFilter.INSTANCE, TrueFileFilter.INSTANCE);
        for (File file : files) {
            try {
                System.out.println("file: " + file.getCanonicalPath());
            } catch (IOException e1) {
                // TODO Auto-generated catch block
                e1.printStackTrace();
            }
        }

    The problem is that I get a java.lang.NoClassDefFoundError: org/apache/commons/io/filefilter/TrueFileFilter error, even though I've imported the necessary libraries. I should mention that I'm working on a plug-in project, and this code runs in a class with a main function. I don't understand why it gives me this error. Thank you!
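
    A clean compile followed by NoClassDefFoundError at runtime means commons-io was on the compile classpath but not the runtime one. In an Eclipse plug-in project the jar has to travel with the bundle; a hedged sketch of the MANIFEST.MF entry, assuming the jar sits in a lib/ folder inside the plug-in (the version number is illustrative):

        Bundle-ClassPath: .,
         lib/commons-io-2.4.jar

    The same lib/commons-io-2.4.jar entry also belongs in build.properties under bin.includes so it gets packaged.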

  • Changing only the div content in jQuery Mobile

    - by user3659748
    I have this:

        <div data-role="page" id="Home">
          <div data-role="header">
            <h2 class="header">My app</h2>
          </div>
          <div data-role="content">
          </div>
          <div data-role="footer" data-position="fixed">
            <div data-role="navbar">
              <ul>
                <li><a href="partials/home.html" data-icon="home" data-transition="slide">Home</a></li>
                <li><a href="partials/about.html" data-icon="info">About</a></li>
              </ul>
            </div>
          </div>
        </div>
        </body>

    When a user clicks one of the links, I want only the content div to slide to the content of, for example, home.html, which contains just "home". Is that possible or not? Thanks :)
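
    jQuery Mobile's default Ajax navigation swaps whole page divs, so keeping one page and replacing only its content region means intercepting the clicks yourself. A hedged sketch (assumes jQuery 1.7+ for .on(); trigger('create') asks jQM to enhance the injected markup; animating the swap to get the slide effect would be a separate step):

        $(document).on('click', '[data-role="navbar"] a', function (e) {
            e.preventDefault();
            // Load the partial into this page's content div only.
            $('#Home [data-role="content"]').load(this.href, function () {
                $('#Home').trigger('create');
            });
        });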

  • Bash script for file search and replace

    - by D3orn
    Hey, I'm trying to write a little bash script. It should copy a directory and all the files in it, then search each file in the copied tree for a string (e.g. @ForTestingOnly) and save that line number. From there it should count each { and }, and as soon as the counts are equal, save that line number as well, then delete all the lines between the two numbers. In short: I'm trying to write a script that finds every occurrence of this annotation and deletes the method that directly follows it. Thanks for the help...
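
    One hedged way to do the delete step, assuming the annotation and the method follow Java's usual brace layout (this is naive about braces inside strings or comments): let awk start skipping at the annotation line and count braces until they balance.

        #!/bin/bash
        # Copy the tree, then strip @ForTestingOnly methods from every copied file.
        src_dir="$1"; dst_dir="$2"
        cp -r "$src_dir" "$dst_dir"

        find "$dst_dir" -type f -name '*.java' | while read -r f; do
            awk '
                /@ForTestingOnly/ { skip = 1; depth = 0; opened = 0 }
                skip {
                    depth += gsub(/\{/, "{")   # gsub only counts here; text unchanged
                    depth -= gsub(/\}/, "}")
                    if (depth > 0) opened = 1
                    if (opened && depth == 0) skip = 0   # line with the final } is dropped too
                    next
                }
                { print }
            ' "$f" > "$f.tmp" && mv "$f.tmp" "$f"
        done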

  • How to Assign a Static IP to an Ubuntu 10.04 Desktop Computer

    - by Mysticgeek
    If you have a home network with several computers, assigning them static IP addresses can make troubleshooting easier. Today we take a look at switching from DHCP to a static IP in Ubuntu.

    Assign a Static IP

    Using static IPs prevents address conflicts between machines and can allow easier access to them. If you have a small home network and are satisfied with the machines getting their IP addresses automatically via DHCP, there won't be anything gained by using static addresses. Static IPs aren't necessarily for the average user, but if you're a geek who wants to know the address assigned to each machine, they can allow for faster troubleshooting.

    To change your Ubuntu machine to a static IP, go to System \ Preferences \ Network Connections. In our example, we're on a wired system, so click on the Wired tab, then select Auto eth0 and click on Edit.

    Select the IPv4 Settings tab, change Method to Manual, and click the Add button. Then type in the static IP address, subnet mask, DNS servers, and default gateway, and click Apply when you're finished. Make sure to hit Enter after typing in the default gateway, otherwise it will revert back to 0.0.0.0. You'll need to enter your admin password before the changes go into effect.

    To verify the changes have been made successfully, launch a Terminal session and type ifconfig at the command prompt. You also might want to ping the address from another machine to make sure everything is communicating.

    Whether you have a small office or a home network set up with a server and several machines, using a static IP on each device can help you manage them easily. Again, it isn't for everyone, as it really depends on how your network is set up and the way you use it.
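
    For servers, or if you prefer the command line, the same change can be made in /etc/network/interfaces and applied with sudo /etc/init.d/networking restart. The addresses below are placeholders; adjust them to your own subnet:

        auto eth0
        iface eth0 inet static
            address 192.168.1.100
            netmask 255.255.255.0
            gateway 192.168.1.1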

  • Reasons to Use a VM For Development

    - by George Stocker
    Background: I work at a start-up company, where one team uses virtual machines to connect to a remote server to do their development, and another team (the team I'm on) uses local IIS/SQL Server 2005/Visual Studio installations to conduct work. Team VM is located about 1000 miles from Team Non-VM, and the servers the VMs run off of are located near Team VM (latency, for those that are wondering, is about 50ms). A person high in the company is pushing for Team Non-VM to use virtual machines for programming, development, and testing. The latter point we agree on -- we want virtual machines to test configurations and various aspects of the web application in a 'clean' state.

    The Problem: What we don't agree on is having developers use RDP to connect to a remote desktop that contains Visual Studio, SQL Server, and IIS to do the same development we could do locally on our laptops. I've tried the VM set-up, and besides the color issue, there is a latency issue that is rather noticeable, not to mention that since we're a start-up, a good number of employees work from home on occasion with our work laptops, and this move would cut off the laptops. They'd be turned in.

    Reasons to use remote VMs for development (not testing!). Here are the stated reasons that this person wants us to use VMs:

    - They work for Team VM.
    - They keep the source code "safe".
    - If we want to work from home, we could just use our home PCs.
    - Licenses (I don't know what the argument is, only that it's been used).

    Reasons not to use remote VMs for development. Here are the stated reasons why we don't want to use VMs:

    - We like working from home. We get a lot done on our own time.
    - We're not going to use our home PCs to do work-related stuff.
    - The latency is noticeable.
    - Support for the VMs (if they go down, or if we need a new VM) takes a while.
    - We don't have administrative privileges on the VM, and are unable to change settings as needed.

    What I'm looking for from the community is this: What reasons would you give for not using VMs for development? Keep in mind these are remote VMs -- this isn't a VM running on a local desktop. It's using the laptop (or a desktop) as a thin client for a remote VM. Also, on the other side of the coin: is there something we're missing that makes VMs more palatable for development?

    Edit: I think 'safe' is used in terms of corporate espionage, or more correctly the risk that if a laptop gets stolen, the thief would have access to our source code. The former, as we've pointed out, is always going to be a possibility -- companies stop that with litigation; there isn't a technical solution (so far as I can see). The latter point (though I don't know its usefulness in a corporate scenario) is mitigated by Truecrypt'ing the entire volume.

  • Installing Yaws server on Ubuntu 12.04 (Using a cloud service)

    - by Lee Torres
    I'm trying to get a Yaws web server working on a cloud service (Amazon AWS). I've compiled and installed a local copy on the server. My problem is that I can't get Yaws to run on either port 8000 or port 80. I have the following configuration in yaws.conf:

        port = 8000
        listen = 0.0.0.0
        docroot = /home/ubuntu/yaws/www/test
        dir_listings = true

    This produces the following successful launch:

        Eshell V5.8.5  (abort with ^G)
        =INFO REPORT==== 16-Sep-2012::17:21:06 ===
        Yaws: Using config file /home/ubuntu/yaws.conf
        =INFO REPORT==== 16-Sep-2012::17:21:06 ===
        Ctlfile : /home/ubuntu/.yaws/yaws/default/CTL
        =INFO REPORT==== 16-Sep-2012::17:21:06 ===
        Yaws: Listening to 0.0.0.0:8000 for <3> virtual servers:
         - http://domU-12-31-39-0B-1A-F6:8000 under /home/ubuntu/yaws/www/trial
        =INFO REPORT==== 16-Sep-2012::17:21:06 ===
        Yaws: Listening to 0.0.0.0:4443 for <1> virtual servers:

    When I try to access the URL (http://ec2-72-44-47-235.compute-1.amazonaws.com), it never connects. I've tried using paping (http://code.google.com/p/paping/) to check whether port 80 or 8000 is open, and I get a "Host can not be resolved" error, so obviously something isn't working. I've also tried setting yaws.conf to port 80, and I get the following error:

        =ERROR REPORT==== 16-Sep-2012::17:24:47 ===
        Yaws: Failed to listen 0.0.0.0:80 : {error,eacces}
        =ERROR REPORT==== 16-Sep-2012::17:24:47 ===
        Can't listen to socket: {error,eacces}
        =ERROR REPORT==== 16-Sep-2012::17:24:47 ===
        Top proc died, terminate gserv
        =ERROR REPORT==== 16-Sep-2012::17:24:47 ===
        Top proc died, terminate gserv
        =INFO REPORT==== 16-Sep-2012::17:24:47 ===
            application: yaws
            exited: {shutdown,{yaws_app,start,[normal,[]]}}
            type: permanent
        {"Kernel pid terminated",application_controller,"{application_start_failure,yaws,{shutdown,{yaws_app,start,[normal,[]]}}}"}

    I've also opened up port 80 using iptables. Running sudo iptables -L gives this output:

        Chain INPUT (policy ACCEPT)
        target     prot opt source                        destination
        ACCEPT     tcp  --  ip-192-168-2-0.ec2.internal   ip-192-168-2-16.ec2.internal  tcp dpt:http
        ACCEPT     tcp  --  0.0.0.0                       anywhere                      tcp dpt:http
        ACCEPT     all  --  anywhere                      anywhere                      ctstate RELATED,ESTABLISHED
        ACCEPT     tcp  --  anywhere                      anywhere                      tcp dpt:http
        ACCEPT     tcp  --  anywhere                      anywhere                      tcp dpt:http

        Chain FORWARD (policy ACCEPT)
        target     prot opt source                        destination

        Chain OUTPUT (policy ACCEPT)
        target     prot opt source                        destination

    In addition, I've gone to the security group panel in the Amazon AWS configuration area and added ports 80, 8000, and 8080 with IP source 0.0.0.0. Please note: if you try to access the URL of the virtual server now, it likely won't connect, because I'm not currently running the yaws daemon. I've tested it when running yaws either through yaws or yaws -i. Thanks for the patience.
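
    The {error,eacces} on port 80 is a privilege issue rather than a Yaws one: on Linux, only root (or a process holding CAP_NET_BIND_SERVICE) may bind ports below 1024. Two hedged options that keep Yaws unprivileged; the iptables line is a standard redirect, while the beam.smp path in the setcap line is an assumption, so locate yours first:

        # Option 1: keep yaws on 8000 and redirect port 80 to it
        sudo iptables -t nat -A PREROUTING -p tcp --dport 80 -j REDIRECT --to-port 8000

        # Option 2: grant the Erlang VM the capability to bind low ports
        sudo setcap 'cap_net_bind_service=+ep' /usr/local/lib/erlang/erts-5.8.5/bin/beam.smp

    The "Host can not be resolved" from paping is a separate symptom; it suggests testing with the instance's public DNS name from outside EC2 and double-checking that the security group rules actually saved.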

  • How to access files on a drive from an older system, mounted in a new system?

    - by David Thomas
    I've recently built a new system after a rather large physical injury was sustained by my previous system (a precarious balance, and gravity, were not a happy mix). Surprisingly, the /home drive of that system appears to have more or less survived the trauma. However... I decided to use a fresh drive for the / (and swap) partition(s), and another fresh drive for the new /home. Now that that's working, I decided to install the old /home drive (which I had assumed until now would be entirely dead and without capacity for use) into the new system to recover the files and data, so far as is possible.

    At this point I've run into a snag: I have no idea how to go about this (with Windows it was relatively easy: the new drive would be the latest letter of the alphabet, and go from there). With Disk Utility (System - Administration - Disk Utility) I've worked out which drive it is (/dev/sda), but clicking on 'mount' produces an error:

        1: helper failed with:
        mount: according to mtab, /dev/sdb1 is already mounted on /
        mount failed

    ...if it is mounted on /, I can't see it. I'm also moderately confused by the disk (device /dev/sda) being referred to as /dev/sdb1. Any and all insights would be incredibly welcome (I've already voted for: Idea #9063: New internal hard drives default automount at Brainstorm).

    Edited in response to Roland's request for a screenshot of Disk Utility. Details (so far as I know them): the 40GB disk is / and swap, the 1.0 TB Samsung is /home, and the 1.0 TB Hitachi is from the old system (and was the old /home drive). Output from sudo fdisk -l pasted below:

        Disk /dev/sda: 1000.2 GB, 1000204886016 bytes
        255 heads, 63 sectors/track, 121601 cylinders
        Units = cylinders of 16065 * 512 = 8225280 bytes
        Sector size (logical/physical): 512 bytes / 512 bytes
        I/O size (minimum/optimal): 512 bytes / 512 bytes
        Disk identifier: 0x000bef00

           Device Boot      Start         End      Blocks   Id  System
        /dev/sda1               1      121601   976760001   83  Linux

        Disk /dev/sdb: 40.0 GB, 40018599936 bytes
        255 heads, 63 sectors/track, 4865 cylinders
        Units = cylinders of 16065 * 512 = 8225280 bytes
        Sector size (logical/physical): 512 bytes / 512 bytes
        I/O size (minimum/optimal): 512 bytes / 512 bytes
        Disk identifier: 0x00037652

           Device Boot      Start         End      Blocks   Id  System
        /dev/sdb1   *           1        4742    38084608   83  Linux
        /dev/sdb2            4742        4866      993281    5  Extended
        /dev/sdb5            4742        4866      993280   82  Linux swap / Solaris

        Disk /dev/sdc: 1000.2 GB, 1000204886016 bytes
        255 heads, 63 sectors/track, 121601 cylinders
        Units = cylinders of 16065 * 512 = 8225280 bytes
        Sector size (logical/physical): 512 bytes / 512 bytes
        I/O size (minimum/optimal): 512 bytes / 512 bytes
        Disk identifier: 0x000e8d46

           Device Boot      Start         End      Blocks   Id  System
        /dev/sdc1               1      121602   976760832   83  Linux
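
    Since Disk Utility identified the old drive as /dev/sda and the fdisk output shows a single Linux partition on it (/dev/sda1), a hedged manual mount onto a fresh mount point sidesteps the desktop automounter entirely (mount autodetects ext3/ext4 when -t is omitted):

        sudo mkdir -p /mnt/oldhome
        sudo mount /dev/sda1 /mnt/oldhome
        ls /mnt/oldhome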

  • How do I migrate my keyring (containing ssh passphrases, nautilus remote filesystem passwords, pgp passwords) and Network Manager connections?

    - by con-f-use
    I changed the disk on my laptop and installed Ubuntu on the new disk. The old disk had 12.04, upgraded to 12.10, on it. Now I want to copy my old keyring with WiFi passwords, ftp passwords for nautilus, and ssh key passphrases. I have all the data from the old disk available (it is now a USB disk, and I haven't deleted the old data yet or done anything with it; I could still put it back in the laptop and boot from it like nothing happened). On the new disk that is now in my laptop, I have installed 12.10 with the same password, user id, and username as on the old disk. I then copied a few important config files from the old disk (e.g. ~/.firefox/, ~/.mozilla, ~/.skype and so on), which all worked fine... except for the keyring: the old method of just copying ~/.gconf/... and ~/.gnome2/keyrings won't work. Did I miss something?

    1. Edit: I figured one needs to copy files not located in the user's home directory as well, so I copied the whole old /home/confus (which is my home directory) to the fresh install, to no effect. That whole copy is now reverted to the fresh install's home directory, so my /home/confus is as it was after the fresh install.

    2. Edit: The folder /etc/NetworkManager/system-connections seems to be the place for WiFi passwords. /usr/share/keyrings could be important for ssh keys as well; that's the only sensible thing a search came up with: find /usr/ -name "*keyring*"

    3. Edit: Still no ssh and ftp passwords from the keyring. What I did:

    1. Converted the old hard drive to a USB drive
    2. Put the new drive in the laptop and installed a fresh 12.10 there (same uid, username and password)
    3. Booted from the old hdd via USB and copied its /etc/NetworkManager/system-connections, ~/.gconf/, ~/.gnome2/keyrings and ~/.ssh over to the new disk
    4. Confirmed that all keys on the old install work
    5. Booted from the new disk

    Result: no passphrases for ssh keys, no ftp passwords in the keyring. At least the WiFi passwords were migrated.

    4. Edit: Bounty! Ending soon...

    5. Edit: The keyring is now in ~/.local/share/keyrings/. Also interesting: ~/.gnupg
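
    For what it's worth, the fifth edit is the crux: from GNOME 3.6 (Ubuntu 12.10) the login keyring lives under ~/.local/share/keyrings/, not ~/.gnome2/keyrings/, so copies into the old path are silently ignored. A hedged sequence, with the old disk mounted at /mnt/old (a hypothetical path) and run while the target graphical session is logged out; it assumes the old and new login passwords match, since the keyring is encrypted with that password (the question says they do):

        # NetworkManager secrets (system-wide)
        sudo cp /mnt/old/etc/NetworkManager/system-connections/* /etc/NetworkManager/system-connections/
        # GNOME keyring: ssh passphrases, nautilus/ftp passwords
        cp /mnt/old/home/confus/.gnome2/keyrings/*.keyring ~/.local/share/keyrings/
        # the ssh keys themselves, plus gnupg if needed
        cp -a /mnt/old/home/confus/.ssh /mnt/old/home/confus/.gnupg ~/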
