Search Results

Search found 377 results on 16 pages for 'gary woods'.

  • Caspol, VMs, Mapped Drives, VS2010

    - by Simon Woods
    Hi. I have a VM (Win7 32-bit) with VS2010 installed. I have a drive mapped into it from the host machine (Win7 64-bit), where I keep some of my VS2010 projects and where I am building them. One of my projects needs to load an assembly. If I copy that assembly to a local drive, the program runs fine. If I leave it on the mapped drive, I get an error: FileLoadException - Could not load file or assembly 'file:///Z:\BusinessTier\bin\Debug\BusinessTier.dll'. I am unsure whether or not I need to run Caspol. Another post on SO pointed me to a post indicating that VS2008 SP1+ removed the need for caspol with respect to network drives, but I wondered if I still need it because I am in a VM. I have tried running the following on the host machine in an attempt to give permissions to VS inside the VM, but to no avail:

        C:\Windows\Microsoft.NET\Framework\v4.0.30128>caspol -m -ag 1.2 -url file://g:\* FullTrust

    where g:\ is the drive being mapped into the VM (as drive z:). What am I missing (apart from understanding!)? Thx Simon
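    Two things might be worth a try here; both are sketches resting on assumptions, not confirmed fixes. First, CAS policy is per-machine, so a grant issued on the host has no effect inside the guest; the same command probably has to run inside the VM, against the letter the VM actually sees:

        REM run inside the VM, using the mapped letter as the guest sees it
        caspol -m -ag 1.2 -url file://Z:\* FullTrust

    Second, the v4.0.30128 path suggests .NET 4, where CAS policy is deprecated and the usual knob is instead loadFromRemoteSources in the loading program's config:

        <!-- app.config of the program doing the loading -->
        <configuration>
          <runtime>
            <loadFromRemoteSources enabled="true" />
          </runtime>
        </configuration>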

  • what's the right way to use scala.io.Source?

    - by woods
    In many examples, it is described that you can use scala.io.Source to read a whole file like this:

        val str = scala.io.Source.fromFile("test.txt").mkString

    But closing the underlying stream is not mentioned. Why does Scala not provide a convenient way to do that, such as the with statement in Python? It looks useful but not difficult. Is there a better way to do this safely in Scala - I mean, to read a whole file?
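    A minimal sketch of the usual workaround: close the Source explicitly in a finally block, or wrap the open/close pair in a small loan-pattern helper of your own so callers can't forget the close():

        val source = scala.io.Source.fromFile("test.txt")
        val str =
          try source.mkString
          finally source.close()

        // a reusable loan-pattern helper
        def withSource[A](path: String)(f: scala.io.Source => A): A = {
          val s = scala.io.Source.fromFile(path)
          try f(s) finally s.close()
        }

        val str2 = withSource("test.txt")(_.mkString)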

  • Git - tidying up a repo

    - by Simon Woods
    Hi. I have got my repo into a bit of a state and want to work my way out of it. The repo looks a bit like this (A1, B1, C1 etc. are obviously commits):

        A1 ---- A2 ---- A3 ---- A4 ---- A5 ---- A6 ---- A7 ---- A8
                                               /
        (from a remote repo)                  /
        B1 ---- B2 ---------------------------
         |  \                                 \
        C1 --------------------------------- C2
          \                                  /
        D1 --- D2 --- D3 --- D4 --- D5 --- D6

    Ideally I'd like to be able to remove all the revisions (with rebase?) on the B, C and D lines (I'm loath to say branches, simply because there are now no local branches on these lines except ref branches to the remote repo) and try to merge in the remote repo again, perhaps in a better way. I'd be grateful for any suggestions as to how to get rid of all these commits. Could I ask that any answers use revision SHA1s rather than branch names? I thought that somehow I'd be able to revert the merge into A7 but can't quite work out how to do it. I hope that is sufficient information. Many thx Simon
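    A sketch of the two usual routes, with placeholder SHA1s standing in for the real ones (look them up with git log --graph --oneline):

        # if the history has NOT been shared: move the branch pointer back to
        # the last commit you want to keep, discarding the merge and what followed
        git reset --hard <sha1-of-A6>

        # if it HAS been pushed: revert the merge commit instead, keeping
        # history intact (-m 1 keeps the first-parent line, i.e. the A line)
        git revert -m 1 <sha1-of-A7>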

  • Design-time failure: WPF, Usercontrols and Namespaces

    - by Simon Woods
    Hi. I have a very simple WPF project comprising a Window and a Usercontrol. I'm very much in a learning phase. It works fine when I run it. However, I am unable to see the form at design time. The problem, I believe, is something to do with namespaces, but I don't understand where. It may well be a simple error.

    Main Window XAML:

        <Window xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
                xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
                xmlns:d="http://schemas.microsoft.com/expression/blend/2008"
                xmlns:mc="http://schemas.openxmlformats.org/markup-compatibility/2006"
                xmlns:views="clr-namespace:UserLogin"
                x:Class="UserLogin.MainView"
                x:Name="MainViewWindow"
                mc:Ignorable="d"
                Title="Login" Height="141" Width="347">
            <Grid>
                <views:LoginView />
            </Grid>
        </Window>

    Main Window CodeBehind:

        Imports Microsoft.VisualBasic
        Imports System
        Imports System.Windows
        Imports UserLogin

        Namespace UserLogin
            Partial Public Class MainView
                Inherits System.Windows.Window

                Public Sub New()
                    InitializeComponent()
                End Sub
            End Class
        End Namespace

    Usercontrol XAML:

        <UserControl xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
                     xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
                     xmlns:mc="http://schemas.openxmlformats.org/markup-compatibility/2006"
                     xmlns:d="http://schemas.microsoft.com/expression/blend/2008"
                     x:Class="UserLogin.LoginView"
                     x:Name="LoginViewControl"
                     mc:Ignorable="d"
                     d:DesignHeight="96" d:DesignWidth="298">
            <Grid Height="96" Width="298">
                <Button Command="{Binding OKCommand}" Height="21" Margin="0,0,90,16" Name="btnOK"
                        VerticalAlignment="Bottom" HorizontalAlignment="Right" Width="76">OK</Button>
                <Button Command="{Binding CancelCommand}" Height="21" HorizontalAlignment="Right"
                        Margin="0,0,9,16" Name="btnCancel" VerticalAlignment="Bottom" Width="75">Cancel</Button>
                <Label Height="23" HorizontalAlignment="Left" Margin="10,5,0,0" Name="Label1"
                       VerticalAlignment="Top" Width="85">Name:</Label>
                <Label HorizontalAlignment="Left" Margin="10,32,0,0" Name="Label2" Width="85"
                       Height="29" VerticalAlignment="Top">Password:</Label>
                <TextBox Margin="0,31,6,0" Name="txtPassword" Height="22" VerticalAlignment="Top"
                         HorizontalAlignment="Right" Width="182" />
                <ComboBox Height="22" Margin="110,6,6,0" Name="cboNames" VerticalAlignment="Top" />
            </Grid>
        </UserControl>

    Usercontrol CodeBehind:

        Imports Microsoft.VisualBasic
        Imports System
        Imports UserLogin

        Namespace UserLogin
            Partial Public Class LoginView
                Inherits System.Windows.Controls.UserControl

                Public Sub New()
                    InitializeComponent()
                End Sub
            End Class
        End Namespace

    I think I'm missing something in this namespace declaration:

        xmlns:views="clr-namespace:UserLogin"

    since intellisense doesn't offer me the usercontrol declared within it in the XAML designer, but rather reports the error "Unable to load the metadata for the assembly ... etc etc". Thx for any suggestions Simon
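    One hedged suggestion, since the error is about loading assembly metadata rather than about the XAML itself: the designer resolves clr-namespace references against the last successfully built assembly, so the project generally needs one clean build before the control can render. If that isn't it, qualifying the mapping with the assembly name sometimes helps (this assumes the project's output assembly really is named UserLogin):

        xmlns:views="clr-namespace:UserLogin;assembly=UserLogin"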

  • Creating a list of most popular posts of the past week - Wordpress

    - by Gary Woods
    I have created a widget for my WordPress platform that displays the most popular posts of the week. However, there is an issue with it: it counts the most popular posts from Monday, not from the past 7 days. For instance, this means that on a Tuesday it will only include posts from Tuesday and Monday. Here is my widget code:

        <?php
        class PopularWidget extends WP_Widget {

            function PopularWidget(){
                $widget_ops = array('description' => 'Displays Popular Posts');
                $control_ops = array('width' => 400, 'height' => 300);
                parent::WP_Widget(false, $name = 'ET Popular Widget', $widget_ops, $control_ops);
            }

            /* Displays the widget in the front-end */
            function widget($args, $instance){
                extract($args);
                $title = apply_filters('widget_title', empty($instance['title']) ? 'Popular This Week' : $instance['title']);
                $postsNum = empty($instance['postsNum']) ? '' : $instance['postsNum'];
                $show_thisweek = isset($instance['thisweek']) ? (bool) $instance['thisweek'] : false;
                echo $before_widget;
                if ( $title ) echo $before_title . $title . $after_title;

                $additional_query = $show_thisweek ? '&year=' . date('Y') . '&w=' . date('W') : '';
                query_posts( 'post_type=post&posts_per_page=' . $postsNum . '&orderby=comment_count&order=DESC' . $additional_query );
                ?>
                <div class="widget-aligned">
                    <h3 class="box-title">Popular Articles</h3>
                    <div class="blog-entry">
                        <ol>
                        <?php if (have_posts()) : while (have_posts()) : the_post(); ?>
                            <li><h4 class="title"><a href="<?php the_permalink(); ?>"><?php the_title(); ?></a></h4></li>
                        <?php endwhile; endif; wp_reset_query(); ?>
                        </ol>
                    </div>
                </div> <!-- end widget-aligned -->
                <div style="clear:both;"></div>
                <?php
                echo $after_widget;
            }

            /* Saves the settings. */
            function update($new_instance, $old_instance){
                $instance = $old_instance;
                $instance['title'] = stripslashes($new_instance['title']);
                $instance['postsNum'] = stripslashes($new_instance['postsNum']);
                $instance['thisweek'] = 0;
                if ( isset($new_instance['thisweek']) )
                    $instance['thisweek'] = 1;
                return $instance;
            }

            /* Creates the form for the widget in the back-end. */
            function form($instance){
                // Defaults
                $instance = wp_parse_args( (array) $instance, array('title' => 'Popular Posts', 'postsNum' => '', 'thisweek' => false) );
                $title = htmlspecialchars($instance['title']);
                $postsNum = htmlspecialchars($instance['postsNum']);
                # Title
                echo '<p><label for="' . $this->get_field_id('title') . '">' . 'Title:' . '</label><input class="widefat" id="' . $this->get_field_id('title') . '" name="' . $this->get_field_name('title') . '" type="text" value="' . $title . '" /></p>';
                # Number of posts
                echo '<p><label for="' . $this->get_field_id('postsNum') . '">' . 'Number of posts:' . '</label><input class="widefat" id="' . $this->get_field_id('postsNum') . '" name="' . $this->get_field_name('postsNum') . '" type="text" value="' . $postsNum . '" /></p>';
                ?>
                <input class="checkbox" type="checkbox" <?php checked($instance['thisweek'], 1) ?> id="<?php echo $this->get_field_id('thisweek'); ?>" name="<?php echo $this->get_field_name('thisweek'); ?>" />
                <label for="<?php echo $this->get_field_id('thisweek'); ?>"><?php esc_html_e('Popular this week', 'Aggregate'); ?></label>
                <?php
            }
        } // end PopularWidget class

        function PopularWidgetInit() {
            register_widget('PopularWidget');
        }
        add_action('widgets_init', 'PopularWidgetInit');
        ?>

    How can I change this script so that it counts the past 7 days rather than posts since last Monday?
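    A sketch of one way to do it (untested): drop the year/week pair and instead append a rolling 7-day condition through WordPress's posts_where filter around the query_posts() call. The helper name is hypothetical:

        <?php
        // hypothetical helper: restrict the query to the last 7 days
        function popular_widget_last_7_days( $where ) {
            $where .= " AND post_date > '" . date( 'Y-m-d H:i:s', strtotime( '-7 days' ) ) . "'";
            return $where;
        }

        add_filter( 'posts_where', 'popular_widget_last_7_days' );
        query_posts( 'post_type=post&posts_per_page=' . $postsNum . '&orderby=comment_count&order=DESC' );
        remove_filter( 'posts_where', 'popular_widget_last_7_days' );
        ?>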

  • Using IvyDE with different workspaces on different branches

    - by James Woods
    I am having problems using IvyDE when I have different workspaces for different branches. I have "Resolve dependencies in workspace" switched on, but every time I change to a different workspace I have to remember to manually clean out the caches. This is because IvyDE always uses the default cache for resolving dependencies within a workspace, so when switching between workspaces the cache can be polluted by different versions. It would seem that it is impossible to work with two different workspaces at the same time. I cannot find a way to configure the location that IvyDE uses to cache the project dependencies; it does not appear to use the caches defined in ivysettings.xml.
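    For reference, a sketch of what the settings-file route would look like if it were honoured: a per-branch cache directory declared in ivysettings.xml. The report above suggests IvyDE may ignore this for workspace resolution, so treat it as an assumption to verify rather than a fix:

        <!-- fragment of ivysettings.xml; hypothetical per-branch cache -->
        <caches defaultCacheDir="${user.home}/.ivy2/cache-branch-a" />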

  • GEdit/Python execution plugin?

    - by Simon Woods
    Hi. I'm just starting out learning Python, with GEdit plus various plugins as my IDE. Visual Studio/F# has a feature which permits highlighting a piece of text in the code window which then, on a keypress, gets executed in the F# console. Is there a similar facility/plugin which would enable this sort of behaviour for GEdit/Python? I do have various execution-type plugins (Run In Python, Better Python Console) but they don't give me this particular behaviour - or at least I'm not sure how to configure them to do so. I find it useful because, in learning Python, I have some test code where I want to execute particular individual lines or small segments (rather than a complete file) to try and understand what they are doing (and the copy/paste can get a bit tiresome) ... or perhaps there is a better way to do code exploration? Many thx Simon
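    One possible route, sketched under the assumption that the stock External Tools plugin is enabled: define a tool whose Input is "Current selection" and whose Output is "Display in bottom pane", with a command that simply pipes the selected lines into the interpreter:

        #!/bin/sh
        # gedit External Tools: the selected text arrives on stdin,
        # so handing it to python executes just that fragment
        python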

  • Insert xelements using LINQ Select?

    - by Simon Woods
    I have a source piece of xml into which I want to insert multiple elements which are created dependent upon certain values found in the original xml. At present I have a sub which does this for me:

        <Extension()>
        Public Sub AddElements(ByVal xml As XElement, ByVal elementList As IEnumerable(Of XElement))
            For Each e In elementList
                xml.Add(e)
            Next
        End Sub

    And this is getting invoked in a routine as follows:

        Dim myElement = New XElement("NewElements")
        myElement.AddElements(
            xml.Descendants("TheElements").
            Where(Function(e) e.Attribute("FilterElement") IsNot Nothing).
            Select(Function(e) New XElement("NewElement",
                New XAttribute("Text", e.Attribute("FilterElement").Value))))

    Is it possible to re-write this using LINQ syntax so I don't need to call out to the Sub AddElements but could do it all in-line? Many thx Simon
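    A sketch of the in-line form: XElement's constructor takes its content as Object(), and LINQ to XML flattens any IEnumerable(Of XElement) passed there, so the query can go straight into the constructor with no helper Sub:

        Dim myElement = New XElement("NewElements",
            xml.Descendants("TheElements").
            Where(Function(e) e.Attribute("FilterElement") IsNot Nothing).
            Select(Function(e) New XElement("NewElement",
                New XAttribute("Text", e.Attribute("FilterElement").Value))))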

  • Android Dev: The constructor Intent(new View.OnClickListener(){}, Class<DrinksTwitter>) is undefined

    - by Malcolm Woods Spark
        package com.android.drinksonme;

        import android.app.Activity;
        import android.content.Intent;
        import android.os.Bundle;
        import android.view.View;
        import android.view.View.OnClickListener;
        import android.widget.Button;
        import android.widget.EditText;
        import android.widget.TextView;

        public class Screen2 extends Activity {

            // Declare our Views, so we can access them later
            private EditText etUsername;
            private EditText etPassword;
            private Button btnLogin;
            private Button btnSignUp;
            private TextView lblResult;

            @Override
            public void onCreate(Bundle savedInstanceState) {
                super.onCreate(savedInstanceState);
                setContentView(R.layout.main);

                // Get the EditText and Button references
                etUsername = (EditText) findViewById(R.id.username);
                etPassword = (EditText) findViewById(R.id.password);
                btnLogin = (Button) findViewById(R.id.login_button);
                btnSignUp = (Button) findViewById(R.id.signup_button);
                lblResult = (TextView) findViewById(R.id.result);

                // Set Click Listener
                btnLogin.setOnClickListener(new OnClickListener() {
                    @Override
                    public void onClick(View v) {
                        // Check Login
                        String username = etUsername.getText().toString();
                        String password = etPassword.getText().toString();
                        if (username.equals("test") && password.equals("test")) {
                            final Intent i = new Intent(this, DrinksTwitter.class); // error on this line
                            startActivity(i);
                            // lblResult.setText("Login successful.");
                        } else {
                            lblResult.setText("Invalid username or password.");
                        }
                    }
                });

                final Intent k = new Intent(Screen2.this, SignUp.class);
                btnSignUp.setOnClickListener(new OnClickListener() {
                    public void onClick(View v) {
                        startActivity(k);
                    }
                });
            }
        }
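    The compiler error in the title comes from the this inside the anonymous OnClickListener: there, this refers to the listener object, not to the Activity, so no Intent(Context, Class) constructor matches. A sketch of the fix, qualifying this with the outer class exactly as the sign-up button's Intent already does:

        final Intent i = new Intent(Screen2.this, DrinksTwitter.class);
        startActivity(i);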

  • Get Exchange Online Mailbox Size in GB

    - by Brian Jackett
    As mentioned in my previous post, I was recently working with a customer to get started with Exchange Online PowerShell cmdlets. In this post I wanted to follow up and show one example of a difference in output from cmdlets in Exchange 2010 on-premises vs. Exchange Online.

    Problem

    The customer was interested in getting the size of mailboxes in GB. For Exchange on-premises this is fairly easy. A fellow PFE, Gary Siepser, wrote an article explaining how to accomplish this (click here). Note that Gary's script will not work when remoting from a local machine that doesn't have the Exchange object model installed. A similar type of scenario exists if you are executing PowerShell against Exchange Online. The data type for TotalItemSize being returned (ByteQuantifiedSize) exists in the Exchange namespace. If the PowerShell session doesn't have access to that namespace (or hasn't loaded it), PowerShell works with an approximation of that data type. The customer found a sample script on this TechNet article that they attempted to use (minor edits by me to fit on the page and remove references to deleted item size):

        Get-Mailbox -ResultSize Unlimited | Get-MailboxStatistics | Select DisplayName,StorageLimitStatus, `
        @{name="TotalItemSize (MB)"; expression={[math]::Round( `
        ($_.TotalItemSize.Split("(")[1].Split(" ")[0].Replace(",","")/1MB),2)}}, `
        ItemCount | Sort "TotalItemSize (MB)" -Descending | Export-CSV "C:\My Documents\All Mailboxes.csv" -NoTypeInformation

    The script is targeted at Exchange 2010 but fails for Exchange Online: there, the TotalItemSize property does not have a Split method, which ultimately causes the script to fail.

    Solution

    A simple solution is to add a call to the ToString method off of the TotalItemSize property (note the ToString() inserted on the TotalItemSize line below):

        Get-Mailbox -ResultSize Unlimited | Get-MailboxStatistics | Select DisplayName,StorageLimitStatus, `
        @{name="TotalItemSize (MB)"; expression={[math]::Round( `
        ($_.TotalItemSize.ToString().Split("(")[1].Split(" ")[0].Replace(",","")/1MB),2)}}, `
        ItemCount | Sort "TotalItemSize (MB)" -Descending | Export-CSV "C:\My Documents\All Mailboxes.csv" -NoTypeInformation

    This fixes the script, but the numerous string replacements and splits are an eyesore to me. I attempted to simplify the string manipulation with a regular expression (more info on regular expressions in PowerShell: click here). The result is a workable script with one nice feature: it adds a new member to the mailbox statistics called TotalItemSizeInBytes. With this member you can then convert into any byte level (KB, MB, GB, etc.) that suits your needs. You can download the full version of this script below (includes commands to connect to an Exchange Online session).

        $UserMailboxStats = Get-Mailbox -RecipientTypeDetails UserMailbox `
        -ResultSize Unlimited | Get-MailboxStatistics
        $UserMailboxStats | Add-Member -MemberType ScriptProperty -Name TotalItemSizeInBytes `
        -Value {$this.TotalItemSize -replace "(.*\()|,| [a-z]*\)", ""}
        $UserMailboxStats | Select-Object DisplayName,@{Name="TotalItemSize (GB)"; `
        Expression={[math]::Round($_.TotalItemSizeInBytes/1GB,2)}}

    Conclusion

    Moving from on-premises to the cloud with PowerShell (and PowerShell remoting in general) can sometimes present new challenges due to what you have access to. This means that you must always test your code / scripts. I still believe that not having to physically RDP to a server is a huge gain over the small hurdles you may encounter during the transition. Scripting is the future of administration and makes you more valuable. Hopefully this script and the concepts presented help you be a better admin / developer.

    -Frog Out

    Links

    The Get-MailboxStatistics Cmdlet, the TotalitemSize Property, and that pesky little "b"
    http://blogs.technet.com/b/gary/archive/2010/02/20/the-get-mailboxstatistics-cmdlet-the-totalitemsize-property-and-that-pesky-little-b.aspx

    View Mailbox Sizes and Mailbox Quotas Using Windows PowerShell
    http://technet.microsoft.com/en-us/exchangelabshelp/gg576861#ViewAllMailboxes

    Regular Expressions with Windows PowerShell
    http://www.regular-expressions.info/powershell.html

    "I don't always test my code..." image
    http://blogs.pinkelephant.com/images/uploads/conferences/I-dont-always-test-my-code-But-when-I-do-I-do-it-in-production.jpg

    The One Thing: Brian Jackett and SharePoint 2010
    http://www.youtube.com/watch?v=Sg_h66HMP9o

  • Hard to append a table with many records into another without generating duplicates

    - by Bill Mudry
    I may seem a bit wordy at first, but I hope it will make it easier for all of you to understand what I am doing in the first place. I have an uncommon but enjoyable activity of collecting as many species of wood from around the world as I can (over 2,900 so far). OK, that is the real world. Meanwhile I have spent over 8 years compiling over 5.8 meg of text data on all the woods of the world. That got so large that learning some basic PHP and MySQL was most welcome, so I could build a new database-driven home for all this research. I am still slow at it but getting there. The original premise was to find evidence of as many species of wood in the world as I can. The more names identified, the more successful the project. I have named the project TAXA for ease of conversation (short for Taxonomy). You are most welcome to take a look at what I have so far at www.prowebcanada.com/taxa. It is 95% dynamically driven.

    So far I am reporting about 6,500 botanical wood names and, as said above, the more I can report, the more successful the project. I have a file of all the woods in the second largest wood collection in the world, the Tervuren wood collection in Belgium, with over 11,300 wood names even after cleaning out all duplicates. That is almost twice the number I am reporting now, so porting all the new wood names from Tervuren into the 'species' table where I keep the reported data would be a major advancement for the project. At one point I was able to add all the Tervuren records to the species table, but over 3,000 duplicates also formed. They were not in the Tervuren file in the first place, but represent the same wood names common to both files. It is common sense that there would be woods common to both that, when merged, would create new duplicates.

    At one point, and with the help of others from another forum, I may very well have finally got the proper SQL statement. When I ran it, though, the system said (semi-amusingly at first) that it had gone away! After looking up on the net what could have caused this, one likely reason is that the MySQL timeout lapses, probably because of the large size of the files I am running. I am running this on a rented account on GoDaddy, so I cannot go about adjusting any config file. For safety, I copied the tervuren.sql file as tervuren_target.sql and the species.sql file as species_master.sql to use as working files, just to make sure I protect the original files from destruction or damage. Later I can rename species_master back to just species.sql once I am happy that all worked well.

    The species file has about 18 columns in it, but only 5 columns match the columns in the Tervuren file (name for name, and collation also). The rest of the columns are just along for the ride, so to speak. The common key in both is the 'species_name' column. I am not sure it is proper to call one a primary key and the other a foreign key, since there really is no relational connection between them. One is just more data for the other, and can disappear afterwards, never to be referred to by the working code in the application.

    I have been very surprised and flabbergasted at how hard it can be to append records from one large table into another (with the same column names, plus others) without generating NEW duplicates in the process. Watch out thinking that a SELECT DISTINCT statement may do the job, because absolutely NO records in the species table must get destroyed in the process, and there is no way (well, that I know of) to tell the DISTINCT command this. Yes, the original 'species' table has duplicates in it even before all this but, trust me, they have to be removed the long hard way, manually, record by record, or I will lose precious information. It is more important to just make sure no NEW duplicates form through bringing new names from tervuren_target.species_name into species.species_name.

    I am hoping and thinking that a straight SQL solution should work --- except for that nasty timeout. How do I get past that? Could it mean that I may have to turn to a PHP-plus-SQL method? Or would I have to break up the Tervuren file into a few smaller ones and run them independently (hope not...)? So far, what seems like it should be easy has proven unexpectedly tricky. I appreciate any help you can give, but start from the assumption that this may be harder to do right than it may seem on the surface.

    By the way, I am running a quad 64-bit system with Windows 7, so at least I have some fairly hefty power on the client end, and a direct Ethernet cable feeding a cable connection to the Internet. Once I get an algorithm and code working for this, I also have many other lists to process that could make the 'species' table grow even more. It could be equivalent to (ahem) lighting a rocket under my project (especially compared to doing this record by record manually)!

    This is my first time in this forum, so I do not know how I will receive any replies. Do I have to come back here periodically, or are replies emailed out also? It would be great if you CC'd copies to me at billmudry at rogers.com :-)

    Much thanks for your patience and help,
    Bill Mudry
    Mississauga, Ontario, Canada (next to Toronto)
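    A sketch of the kind of statement that appends only the genuinely new names, plus a batching idea for the shared-host timeout. Everything except species_name and the two table names is a hypothetical placeholder; substitute the four other matching columns:

        -- append Tervuren names that do not already exist in species_master;
        -- col_a .. col_d stand in for the four other shared columns
        INSERT INTO species_master (species_name, col_a, col_b, col_c, col_d)
        SELECT t.species_name, t.col_a, t.col_b, t.col_c, t.col_d
        FROM tervuren_target AS t
        LEFT JOIN species_master AS s ON s.species_name = t.species_name
        WHERE s.species_name IS NULL;

    Nothing in species_master is touched; rows only arrive when no matching species_name exists yet. If the single statement still hits the "MySQL server has gone away" timeout, the same SELECT can be run in slices from a small PHP loop (for example ORDER BY t.species_name LIMIT 0,1000, then LIMIT 1000,1000, and so on), so each statement stays under the host's limit.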

  • SharePoint 2010 Design & Deployment Best Practices

    - by Michael Van Cleave
    Well, now that SharePoint 2010 has successfully launched and everyone is scratching for every piece of best-practices information they can get their hands on, I would like to invite anyone and everyone to come and take part in ShareSquared's next webinar. The webinar will cover some key information such as:

      • Pros and cons of the different approaches to installing and configuring SharePoint 2010
      • Configuration best practices for SharePoint 2010 farms
      • Services architecture: dependencies, licensing, and topologies
      • Information architecture guidance for sizing, multilingual support, multi-tenancy, and more
      • Using tools such as SharePoint Composer and SharePoint Maestro to configure and deploy SharePoint 2010
      • And most of all, avoiding common pitfalls for installation and deployment

    What is better than all of that? Well, the even more exciting thing is that the presenters will be our very own SharePoint MVPs Gary Lapointe and Paul Stork. If you don't know who these guys are, then you should definitely check out their blogs and their contributions to the SharePoint community. To get more information and register, click here: REGISTER

    Other great links to information in this post:

      • ShareSquared, Inc
      • Gary Lapointe's Blog
      • Paul Stork's Blog
      • SharePoint Composer

    Check it out and get up to speed from some of the best in the industry. Michael

  • links for 2011-02-03

    - by Bob Rhubart
      • Webcast: Reduce Complexity and Cost with Application Integration and SOA
        Speakers: Bruce Tierney (Product Director, Oracle Fusion Middleware) and Rajendran Rajaram (Oracle Technical Consultant). Thursday, February 17, 2011. 10 a.m. PT / 1 p.m. ET. (tags: oracle otn soa fusionmiddleware)

      • William Vambenepe: The API, the whole API and nothing but the API
        William asks: "When programming against a remote service, do you like to be provided with a library (or service stub) or do you prefer 'the API, the whole API, nothing but the API?'" (tags: oracle otn API webservices soa)

      • Gary Myers: Fluffy white Oracle clouds by the hour
        Gary says: "Pay-by-the-hour options are becoming more common, with Amazon and Oracle getting even more intimate in the next few months. Yes, you too will be able to pay for a quickie with the king of databases (or queen if you prefer that as a mental image)." (tags: oracle otn cloudcomputing amazon ec2)

      • Conversation as User Assistance (the user assistance experience)
        "To take advantage of the conversations on the web as user assistance, enterprises must first establish where on the spectrum their community lies." -- Ultan O'Broin (tags: oracle otn enterprise2.0 userexperience)

      • Webcast: Oracle WebCenter Suite – Giving Users a Modern Experience
        Thursday, February 10, 2011. 11 a.m. PT / 2 p.m. ET. Speakers: Vince Casarez, Vice President of Enterprise 2.0 Product Management, Oracle; Erin Smith, Consulting Practice Manager – Portals, Oracle; Robert Wessa, Consulting Technical Director, Enterprise 2.0 Infrastructure, Oracle. (tags: oracle otn enterprise2.0 webcenter)

  • Oh snap! My RPi was upgraded to 512MB! Woo-hoo!

    - by hinkmond
    I ordered a Raspberry Pi Model B (256MB) over 4 months ago on backorder. When it finally came, I saw it was upgraded to the new half-a-gig model! Woot! But all was not perfect. Gary C. told me the shipped configuration of the new RPi models didn't have the right firmware for 512MB, and I had to upgrade the start.elf in the /boot directory to recognize all of the 512MB RAM. I did a "free" command, and sure enough saw only 240MB. Sadness. But Gary gave me a copy of his start.elf, which worked after some trial and error. For anyone ordering the new RPi Model B w/512MB, here are the steps to get you going with the full 512MB RAM:

        sudo apt-get update --fix-missing
        sudo apt-get upgrade --fix-missing
        # NOTE: This step takes at least a couple hours on a
        # fast network
        wget https://raw.github.com/raspberrypi/firmware/\
        164b0fe2b3b56081c7510df93bc1440aebe45f7e/boot/\
        arm496_start.elf
        sudo mv /boot/start.elf /boot/orig-start.elf
        sudo mv arm496_start.elf /boot/start.elf
        sudo reboot

    After the reboot, free shows the full complement:

        free
                     total       used       free     shared    buffers     cached
        Mem:        497768     210596     287172          0      16892     169624
        -/+ buffers/cache:      24080     473688
        Swap:       102396          0     102396

    So of course this means... (drumroll) there is now 498MB available for the Java Embedded heap!

        java -Xmx400m -version
        java version "1.7.0_06"
        Java(TM) SE Embedded Runtime Environment (build 1.7.0_06-b24, headless)
        Java HotSpot(TM) Embedded Client VM (build 23.2-b09, mixed mode)

    Yeah, baby! Hinkmond

  • MVPs and the Community

    - by andyleonard
    Introduction: Earlier this month, David Woods decided to drop his MVP award. The move inspired some interesting comments and discussion among MVPs. David's points are:

      • MVP Expertise
      • Microsoft Technology Products
      • Microsoft "Listens"
      • Cost-Benefits for an MVP

    MVP Expertise: After mentioning he's encountered MVPs who are not experts, David states: "The way you get in is by contributing to the community." Honestly, I don't know the specifics of how someone becomes a Microsoft MVP. And I'm OK with that... (read more)

  • Locating Rogue Perl Script

    - by Gary Garside
    I've been trying to source the location of a Perl script which is causing havoc on a server which I control. I'm also trying to find out exactly how this script was installed on the server; my best guess is through a WordPress exploit. The server is a basic web setup running Ubuntu 9.04, Apache and MySQL. I use iptables for the firewall; the server runs around 20 sites and the load never really creeps above 0.7. From what I can see, the script is making outbound connections to other servers (most likely trying to brute-force entry). Here is a top dump of one of the processes:

        PID   USER     PR NI VIRT  RES  SHR S %CPU %MEM TIME+    COMMAND
        22569 www-data 20 0  22784 3216 780 R 100  0.2  47:00.60 perl

    The command the process claims to be running is /usr/sbin/sshd. I've tried to find an exact file name but I'm having no luck... I've run lsof -p PID and here is the output:

        COMMAND PID   USER     FD  TYPE DEVICE SIZE    NODE   NAME
        perl    22569 www-data cwd DIR  8,6    4096    2      /
        perl    22569 www-data rtd DIR  8,6    4096    2      /
        perl    22569 www-data txt REG  8,6    10336   162220 /usr/bin/perl
        perl    22569 www-data mem REG  8,6    26936   170219 /usr/lib/perl/5.10.0/auto/Socket/Socket.so
        perl    22569 www-data mem REG  8,6    22808   170214 /usr/lib/perl/5.10.0/auto/IO/IO.so
        perl    22569 www-data mem REG  8,6    39112   145112 /lib/libcrypt-2.9.so
        perl    22569 www-data mem REG  8,6    1502512 145124 /lib/libc-2.9.so
        perl    22569 www-data mem REG  8,6    130151  145113 /lib/libpthread-2.9.so
        perl    22569 www-data mem REG  8,6    542928  145122 /lib/libm-2.9.so
        perl    22569 www-data mem REG  8,6    14608   145125 /lib/libdl-2.9.so
        perl    22569 www-data mem REG  8,6    1503704 162222 /usr/lib/libperl.so.5.10.0
        perl    22569 www-data mem REG  8,6    135680  145116 /lib/ld-2.9.so
        perl    22569 www-data 0r  FIFO 0,6            157216 pipe
        perl    22569 www-data 1w  FIFO 0,6            197642 pipe
        perl    22569 www-data 2w  FIFO 0,6            197642 pipe
        perl    22569 www-data 3w  FIFO 0,6            197642 pipe
        perl    22569 www-data 4u  IPv4 383991         TCP outsidesoftware.com:56869->server12.34.56.78.live-servers.net:www (ESTABLISHED)

    My gut feeling is that outsidesoftware.com is also under attack, or possibly being used as a tunnel. I've managed to find a number of rogue files in /tmp and /var/tmp; here is a brief excerpt of one of them:

        #!/usr/bin/perl
        # this spreader is coded by xdh
        # xdh@xxxxxxxxxxx
        # only for testing...
        my @nickname = ("vn");
        my $nick = $nickname[rand scalar @nickname];
        my $ircname = $nickname[rand scalar @nickname];
        #system("kill -9 `ps ax |grep httpdse |grep -v grep|awk '{print $1;}'`");
        my $processo = '/usr/sbin/sshd';

    The full file contents can be viewed here: http://pastebin.com/yenFRrGP

    I'm trying to achieve a couple of things here. Firstly, I need to stop these processes from running, either by disabling outbound SSH or with iptables rules etc. These scripts have been running for around 36 hours now, and my main concern is to stop them running and respawning by themselves. Secondly, I need to try and source where and how these scripts were installed. If anybody has any advice on what to look for in access logs, or anything else, I would be really grateful. Thanks in advance.
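    A sketch of a first-response sequence, under the assumption that the process is still alive (the /proc entries vanish once it dies), addressing both goals:

        # what the fake "sshd" really is: binary, working directory, full argv
        ls -l /proc/22569/exe /proc/22569/cwd
        tr '\0' ' ' < /proc/22569/cmdline; echo

        # cut off the web user's outbound connections while cleaning up
        # (Apache only needs to accept inbound; add exceptions if any site
        #  legitimately makes outbound calls, e.g. to a payment API)
        iptables -A OUTPUT -m owner --uid-owner www-data -j REJECT

        # then kill the process, and correlate the creation times of the
        # /tmp files against the Apache access logs to find the entry point
        kill -9 22569
        ls -la --time-style=full-iso /tmp /var/tmp
        grep POST /var/log/apache2/access.log | less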

  • Changing default gateway on workstations connected to Windows Domain SBS server

    - by Gary B2312321321
    We have XP workstations connected to a Small Business Server acting as Active Directory / ISA firewall / proxy (no DHCP). Is there a reason that, after installing a 2nd firewall on the network (same subnet etc.), changing the default gateway on the workstations isn't sufficient to route internet traffic through the new firewall? A freshly set-up Linux box connects straight through the alternate firewall with just an IP, default gateway and DNS settings. Will having ISA still active on the network confuse the process? Are there further config settings deeper down in Windows that need attention? Any ideas/pointers on this would be appreciated. Other info: firewalls tried are Smoothwall and IPCop; it's a small Ethernet network of 40 PCs; we can ping the new firewalls from the workstations; and activating the web proxy on the new firewall and reconfiguring the workstation browsers works fine. The point of the 2nd firewall is the lack of some necessary features in ISA for a Linux app. It would be nice to have some redundancy too.

  • How to stop Excel 2003 from loading a Gazillion files

    - by Gary M. Mugford
    One of my soon-to-be-ex-friends got an Excel file from another friend of his and decided to click on it. It started opening all kinds of files from within Excel - over 200 and still counting when he called me. I told him to go to Task Manager, which showed a LOT of files in the Applications tab, but only one excel.exe in the Processes tab. Closing it down there closed down Excel. I then CrossLooped in to see if I could give him a helping hand. Each time Excel was re-opened, the mass influx of files started again. They were all kinds of files: PDFs, DOCs, JPGs, even some spreadsheets. It looked like the end of solitaire, with multiple windows opening (XP) and the counter on the lone Excel button on the task bar counting off the files. I did the Task Manager exit routine and went looking for temp files. I CrapCleaned out the system, went through the files created in the last hour, and deleted anything with "temp" anywhere in it. I also deleted the crappy infected/corrupted file from its place on the desktop (yeah, I know, I yelled for 15 minutes on THAT subject). Despite the delousing, restarting Excel - which complained of a deactivated add-in - would start the cascading windows whether I answered yes or no to that question. Yes, it knew it had a serious crash, but why would it keep trying to load the bad file even when I got rid of it? But here's the real question: WHERE was it loading from? I went through the backup folder and NOTHING was there! So what's the process for starting Excel WITHOUT it trying to do a crash recovery? Sort of makes me feel stupid at times. Thanks for any light you can shed on this issue. GM
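    Two hedged leads on "where is it loading from": Excel re-opens anything sitting in its startup folders at every launch, and the /safe switch starts it with add-ins, alternate startup folders and toolbar customizations skipped, which is the cleanest way in without triggering the cascade:

        REM start Excel 2003 in safe mode (Start > Run)
        excel.exe /safe

        REM then check the auto-open folders for planted files
        dir "%AppData%\Microsoft\Excel\XLSTART"
        dir "C:\Program Files\Microsoft Office\OFFICE11\XLSTART"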

  • Compiling Mono on Fedora 7

    - by Gary
    Trying to install Mono from source on Fedora 7. Running

        # ./configure --prefix=/opt/mono

    works fine, but doing the make

        # make ; make install

    ends up with the following:

        Makefile:93: warning: overriding commands for target `csproj-local'
        ../build/executable.make:131: warning: ignoring old commands for target `csproj-local'
        make install-local
        make[6]: Entering directory `/opt/mono-2.6.4/mcs/mcs'
        Makefile:93: warning: overriding commands for target `csproj-local'
        ../build/executable.make:131: warning: ignoring old commands for target `csproj-local'
        MCS [basic] mcs.exe
        typemanager.cs(2047,40): error CS0103: The name `CultureInfo' does not exist in the context of `Mono.CSharp.TypeManager'
        Compilation failed: 1 error(s), 0 warnings
        make[6]: *** [../class/lib/basic/mcs.exe] Error 1
        make[6]: Leaving directory `/opt/mono-2.6.4/mcs/mcs'
        make[5]: *** [do-install] Error 2
        make[5]: Leaving directory `/opt/mono-2.6.4/mcs/mcs'
        make[4]: *** [install-recursive] Error 1
        make[4]: Leaving directory `/opt/mono-2.6.4/mcs'
        make[3]: *** [profile-do--basic--install] Error 2
        make[3]: Leaving directory `/opt/mono-2.6.4/mcs'
        make[2]: *** [profiles-do--install] Error 2
        make[2]: Leaving directory `/opt/mono-2.6.4/mcs'
        make[1]: *** [install-exec] Error 2
        make[1]: Leaving directory `/opt/mono-2.6.4/runtime'
        make: *** [install-recursive] Error 1

    I've been following the instructions at http://ruakuu.blogspot.com/2008/06/installing-and-configuring-opensim-on.html. This is all in an effort to get OpenSimulator running.

  • Mouse click-focus wanders in vmPlayer 3.0 dual-monitor

    - by Gary M. Mugford
    Previously, a WinXP SP3 session running on a WinXP SP3 host computer ran perfectly fine in a dual-monitor setup. There were no issues with vmPlayer 2.x. BEFORE updating to vmPlayer 3, the following problem cropped up. When clicking in a single monitor, you would get exactly what you expected. However, when the display was stretched across two monitors, the click would land to the left of the mouse cursor: the farther RIGHT you were, the farther left the click would occur. In other words, if you clicked on the system menu of a window in the upper left of the left monitor, you would get the system menu. Move half a screen to your right, and the click would land on an item about a quarter of the way over rather than where you were clicking. And by going all the way to the far right of the right monitor, you could bring up a right-click menu on the far right of the LEFT monitor. I hope I have described this properly; it's confusing, even in words. In single-monitor mode, everything works perfectly fine. If, instead of using either UltraMon or DisplayFusion, you run a single desktop stretched across both monitors (3200x1600), there are no mousing issues. Unfortunately, with two 1600x1200 monitors, that depth of 1600 makes that hack less than useable, and my graphics card won't offer anything resembling 3200x1200. vmPlayer 3.0 did not alleviate the situation. The Microsoft mouse drivers are up to date, and so are the nVidia card drivers. Any ideas?

  • How to setup equivalent USVIDEO.ORG DNS-Proxy on Linux

    - by Gary
    I have a VPS in the USA running Ubuntu. I want to set up something similar to http://www.usvideo.org. Basically, USVIDEO is a DNS service that allows Canadians to access American content like Hulu, Netflix and NBC that is restricted by geographical IP. Here is how I think USVideo does it:

      1. Clients (PS3, Xbox, PC) specify the DNS server(s) given on USVIDEO.org's website.
      2. If the DNS request is for a video/audio site such as Netflix or Pandora, the request is forwarded to a proxy; all other requests are forwarded to a different DNS server.
      3. When a geo-checked video/audio domain is requested, the DNS returns the address of the proxy server, which relays traffic to the destination domain via the U.S. gateway, so that the access appears to come from a U.S. IP address.
      4. Once the stream has passed the U.S. IP address check, the proxy steps out of the loop and lets the video streaming site contact you directly to start the stream. This trick relies on the streaming sites checking the country of your IP address once up front, but not re-checking it while the video is streaming.

    What is elegant about this solution is that a VPN tunnel is not required to bypass geographical IP checks. All that is required on the client side is to specify the DNS server (the VPS). If a certain site is geographically locked, just forward the traffic to a proxy, and that's it. These sites can be specified in the DNS entries, or perhaps the proxy service can redirect the DNS request to its own proxy. I believe what I need to set this up is Squid, iptables, and DNS. What I need help with is how exactly to approach this. Would Squid be set up as a transparent proxy?
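    A sketch of the DNS half using dnsmasq syntax - an assumption, since any resolver supporting per-domain overrides would do, and 203.0.113.10 is a placeholder for the VPS's public address:

        # /etc/dnsmasq.conf on the VPS
        # geo-fenced domains resolve to the proxy...
        address=/netflix.com/203.0.113.10
        address=/hulu.com/203.0.113.10
        # ...everything else is forwarded to a normal resolver
        server=8.8.8.8

    Squid would then listen on the VPS address for the web traffic these answers attract. One caveat: HTTPS streams can't be intercepted this way, which is why a plain TCP relay for port 443 is usually part of the picture for services of this kind.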

  • How to black out second monitor when playing a full screen game?

    - by Gary
    I've got two monitors. When I play a full screen game in my primary monitor, the second monitor still shows the Windows desktop. How can I black out the second monitor so that it doesn't show anything at all? My graphics card is from ATI. I'm running Windows 7 in Boot Camp on an iMac. I don't know if this makes a difference, however, as any solution to my question will do, really. For instance, is there a tiny app that I could run to disable the second monitor for a while? I'd rather not have to open Windows display settings every time I run a game and disable the monitor that way, though.

  • Is On-The-Fly string replacement possible using GreaseMonkey and Firefox

    - by Gary M. Mugford
    I have looked for ways to stop Brightcove videos from autostarting in Firefox and have come to the conclusion that it isn't possible without external programming via something like GreaseMonkey. However, I'm not proficient in JavaScript, let alone GM, so I thought I'd ask here first whether what I want to do is feasible, or whether it's a fool's errand. What I want to accomplish is to have a site-specific script perform a string replacement on the fly in that site's code. Specifically, what I am looking for is something GM-style that would do this:

        if site_domain = 'www.SiteWithAutoPlayVideos.com' then
            replace_all('<param name="autoStart" value="true" />',
                        '<param name="autoStart" value="false" />');

    Having looked through Super User for anything GreaseMonkey-related, I see notices that the sandbox GM executes scripts in has to remain separate for security reasons, so I suspect I might be in for disappointment. BUT if it is accomplishable and somebody here can confirm it, then I will do my best to struggle through the learning curve and get this noisome little problem put to rest. Yes, I have tried Flash Block and FlashDisable in order to attack this issue, to no avail. Thanks in advance for your time.
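    For what it's worth, a sketch of how that pseudocode could translate into an actual user script. The main assumption: the autoStart <param> must be present in the page's static markup when the script runs; if the player writes it in later by script, this won't catch it:

        // ==UserScript==
        // @name      Brightcove no-autostart (sketch)
        // @include   http://www.SiteWithAutoPlayVideos.com/*
        // ==/UserScript==

        // flip every autoStart param to false before the player reads it
        var params = document.querySelectorAll('param[name="autoStart"]');
        for (var i = 0; i < params.length; i++) {
            params[i].setAttribute('value', 'false');
        }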

  • Add Route for machine in same DC

    - by gary
    My routing table on my machine, which has the IP 46.84.121.243, currently looks like this:

        Network Destination    Netmask          Gateway        Interface      Metric
        0.0.0.0                0.0.0.0          46.84.121.225  46.84.121.243  21
        46.84.121.224          255.255.255.224  On-link        46.84.121.243  276
        46.84.121.239          255.255.255.255  On-link        46.84.121.243  21
        46.84.121.243          255.255.255.255  On-link        46.84.121.243  276
        46.84.121.255          255.255.255.255  On-link        46.84.121.243  276

    I'm trying to access 46.84.121.239, which is my other machine in the same DC, but my guess is the first rule is blocking it, as the traffic is trying to go via the gateway and failing:

        Tracing route to [46.84.121.239] over a maximum of 30 hops:
          1  OWNEROR-9O83HBL [46.84.121.243]  reports: Destination host unreachable.
        Trace complete.

    I'm doing all this via RDP, and I've already tried changing the metric on the persistent rule - with devastating consequences! Here is the persistent (working) rule:

        Persistent Routes:
          Network Address  Netmask  Gateway Address  Metric
          0.0.0.0          0.0.0.0  46.84.121.225    1

    Any help in getting access to 46.84.121.239 would be very welcome. Thanks very much.
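    One hedged idea, given that the trace dies on the local interface even though an on-link host route for .239 already exists: some data centres block direct layer-2 traffic between neighbouring hosts, in which case the host route has to point at the gateway instead of being on-link. A sketch (run from a console session rather than over RDP if at all possible, since route changes can drop the connection):

        route delete 46.84.121.239
        route add 46.84.121.239 mask 255.255.255.255 46.84.121.225 metric 1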
