Search Results

Search found 587 results on 24 pages for 'docking station'.

Page 10/24

  • How do I mount this HDD?

    - by Casval Deikun
    I have a Windows HDD attached to an external docking bay on my Ubuntu system, but mounting it brings up an error: Mounting exited with exit code 14: Windows is hibernated, refused to mount. Failed to mount '/dev/sde2': Operation not permitted. The NTFS partition is hibernated. Please resume and shutdown Windows properly, or mount the volume read-only with the 'ro' mount option, or mount the volume read-write with the 'remove_hiberfile' mount option. For example type on the command line: mount -t ntfs-3g -o remove_hiberfile /dev/sde2 /media/FE46D60C46D5C615 I am at a loss for how to remove the hiberfile, or even how to mount the volume read-only or read-write. I tried copying and pasting that exact command into my terminal, but it said: mount: only root can do that. I do not know what to do at this point. I need to get the information off this drive, and I do not have another computer to put it in. Does anyone know what I should do from here?
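
    The command in the error message just needs root privileges (hence "only root can do that") and its spaces restored. A minimal sketch, reusing the device and mount point from the message (adjust them if yours differ):

        # safest first step: mount read-only and copy the files off
        sudo mkdir -p /media/FE46D60C46D5C615
        sudo mount -t ntfs-3g -o ro /dev/sde2 /media/FE46D60C46D5C615

        # or discard the Windows hibernation state and mount read-write
        sudo mount -t ntfs-3g -o remove_hiberfile /dev/sde2 /media/FE46D60C46D5C615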

    Read the article

  • Recover files from a SATA drive from USB

    - by Shane
    I've damaged the drive I had in my Windows laptop and now I want to try to recover as many files as possible. I know very little about Linux though. I have Ubuntu 10.04. I have a docking station for the drive and it is connected to my Linux machine. The drive appears in Disk Utility. Unfortunately, this is where I have no idea how to proceed. Any help is appreciated and I can provide more info if needed.
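
    A rough outline of the recovery steps, assuming the Windows partition shows up as /dev/sdb1 (check the real device name in Disk Utility or with sudo fdisk -l before mounting):

        sudo fdisk -l                                        # identify the NTFS partition, e.g. /dev/sdb1
        sudo mkdir -p /mnt/recovery
        sudo mount -t ntfs-3g -o ro /dev/sdb1 /mnt/recovery  # read-only, to avoid writing to a damaged disk
        cp -rv /mnt/recovery/Users ~/recovered               # copy whatever folders you need

    If the drive itself is physically failing, it is usually safer to image it first with ddrescue (package gddrescue) and recover files from the image instead.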

    Read the article

  • How 'docking' works, in simple words

    - by Shirish11
    I would like to know the idea behind the working of 'docking' in applications. I have worked with applications where components from individual forms are docked on a single main form to provide the necessary GUI, but I don't have any idea what's happening in the background. According to Wikipedia: A dock is a graphical user interface element that typically provides the user with a way of launching, switching between, and monitoring running programs or applications. Now I am a bit confused whether it is a component, an event, a property or something else. EDIT: The application was developed in Delphi on the Windows platform. There is also more than one kind of docking in Delphi (manual dock and automatic dock).
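
    For the Delphi case specifically, docking is driven by a handful of properties and methods on the VCL controls rather than by a single component. A small sketch of the usual pattern (the control names are made up for the example):

        // the panel on the main form advertises itself as a place things can dock to
        PanelDock.DockSite := True;

        // the secondary form (or control) is allowed to be dragged and docked
        SubForm.DragKind := dkDock;
        SubForm.DragMode := dmAutomatic;   // "automatic dock": drag it over the panel to dock

        // "manual dock": dock it from code instead of by dragging
        SubForm.ManualDock(PanelDock);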

    Read the article

  • Surface Pro 3 first impressions

    - by John Paul Cook
    I traded in my Surface 2 (the trade-in program is now over) and bought a Surface Pro 3 with an i7 processor and 8 GB of RAM. I greatly prefer the 3:2 aspect ratio of the Surface Pro 3. After only one day of ownership, I’ve decided to purchase a docking station. I have a 7-year-old desktop with a quad core Q6600 processor overclocked to 3.0 GHz and 8 GB of RAM. It has a Plextor 512 GB SSD as the primary drive. It’s a very capable machine, but it does have a little bit, and I do mean only a little bit,...(read more)

    Read the article

  • Iomega Home Media Network Hard Drive 1TB Cloud Edition failed, data recovery?

    - by lonbon69
    I have an Iomega Home Media Network Drive Cloud Edition 1TB that started clicking and then displayed a failure LED code (power LED plus red LED). I removed the SATA drive, inserted it into an 'All in 1 HDD Docking Station' and connected it to my laptop by USB (the laptop runs Windows 7). The dock is seen as drive E:, but I cannot access it and it reports 0% data. The drive does spin up when I power the dock. Web searches say the drive has an EXT3 file system and that I should use Ubuntu to access it. I have now set up a dual-boot laptop but I still do not see the drive in Ubuntu. Is there something else I need to do to get it recognised? I really would like to recover the data; any suggestions please?
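
    A rough sequence to try from the Ubuntu side (the partition numbers are guesses; on these NAS drives the data partition is often not the first one, so try each in turn, read-only):

        sudo fdisk -l                                   # find the drive, e.g. /dev/sdb, and its partitions
        sudo mkdir -p /mnt/iomega
        sudo mount -t ext3 -o ro /dev/sdb2 /mnt/iomega  # repeat with /dev/sdb1, /dev/sdb3, ... if this fails
        dmesg | tail                                    # any mount failure will leave a hint here

    Given the clicking noise, copy the data off as soon as anything mounts; the drive may not have much life left.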

    Read the article

  • NHibernate unmapped class exception

    - by John Prideaux
    I am trying to implement a one-to-many relationship using NHibernate 2.1.2 but keep getting "Association references unmapped class" exceptions. I have verified that my hbm.xml files are embedded resource. Here are my classes and mappings. Any ideas? public class OrderStatus { public virtual decimal MainCommit { get; set; } public virtual decimal CommitNumber { get; set; } public virtual string InvoiceNumber { get; set; } public virtual string ShipTo { get; set; } public virtual string CustomerOrderNumber { get; set; } public virtual string Station { get; set; } public virtual DateTime RequestedShipDate { get; set; } public virtual decimal EstimatedValue { get; set; } public virtual decimal EstimatedWeight { get; set; } public virtual string Customer { get; set; } public virtual DateTime InvoiceDate { get; set; } public virtual ICollection<Promise> Promises { get; set; } } <class name="AladdinDb.Models.OrderStatus, AladdinDb" table="vorder_status"> <id name="CommitNumber" type="decimal" column="commit_no"> <generator class="assigned"> <param name="property"> Plan </param> </generator> </id> <property name="MainCommit" column="main_commit" type="decimal" /> <property name="InvoiceNumber" column="invoice_no" type="string" /> <property name="ShipTo" column="ship_to" type ="string"/> <property name="CustomerOrderNumber" column="cust_order_no" type="string" /> <property name="Station" column="station" type="string" /> <property name="RequestedShipDate" column="req_ship_date" type="DateTime" /> <property name="EstimatedValue" column="estimated_value" type="decimal"/> <property name="EstimatedWeight" column="estimated_weight" type="decimal" /> <property name="Customer" column="customer" type="string" /> <property name="InvoiceDate" column="invoice_date" /> <set name="Promises"> <key column="commit_no"></key> <one-to-many class="Promise" /> </set> </class> public class Promise { public virtual decimal CommitNumber { get; set; } public virtual DateTime PromiseDate { get; set; } public virtual string WhoAsked { get; set; } public virtual string WhoGave { get; set; } public virtual string Iffy { get; set; } } <class name="AladdinDb.Models.Promise, AladdinDb" table="promise"> <id name="CommitNumber" type="decimal" column="commit_no"> <generator class="assigned" /> </id> <property name="PromiseDate" column="promise_date" /> <property name="WhoAsked" column="who_asked" /> <property name="WhoGave" column="who_gave" /> <property name="Iffy" column="iffy" /> </class>
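
    One common cause of "Association references unmapped class: Promise" with mappings like these is that the one-to-many uses the short name Promise while the hbm.xml files declare no default assembly/namespace, so NHibernate cannot resolve it. A hedged sketch of the two usual fixes (either one alone should be enough):

        <!-- option 1: qualify the association the same way the <class> elements are qualified -->
        <one-to-many class="AladdinDb.Models.Promise, AladdinDb" />

        <!-- option 2: declare defaults once on each mapping root, after which short names resolve -->
        <hibernate-mapping xmlns="urn:nhibernate-mapping-2.2"
                           assembly="AladdinDb" namespace="AladdinDb.Models">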

    Read the article

  • How to insert data if it contains an apostrophe?

    - by angel ansari
    Actally my task is load csv file into sql server using c# so i have split it by comma my problem is that some field's data contain apostrop and i m firing insert query to load data into sql so its give error my coding like that using System; using System.Collections.Generic; using System.ComponentModel; using System.Data; using System.Drawing; using System.Linq; using System.Text; using System.Windows.Forms; using System.IO; using System.Data.SqlClient; namespace tool { public partial class Form1 : Form { StreamReader reader; SqlConnection con; SqlCommand cmd; int count = 0; //int id=0; FileStream fs; string file = null; string file_path = null; SqlCommand sql_del = null; public Form1() { InitializeComponent(); } private void button1_Click(object sender, EventArgs e) { OpenFileDialog file1 = new OpenFileDialog(); file1.ShowDialog(); textBox1.Text = file1.FileName.ToString(); file = Path.GetFileName(textBox1.Text); file_path = textBox1.Text; fs = new FileStream(file_path, FileMode.Open, FileAccess.Read); } private void button2_Click(object sender, EventArgs e) { if (file != null ) { sql_del = new SqlCommand("Delete From credit_debit1", con); sql_del.ExecuteNonQuery(); reader = new StreamReader(file_path); string line_content = null; string[] items = new string[] { }; while ((line_content = reader.ReadLine()) != null) { if (count >=4680) { items = line_content.Split(','); string region = items[0].Trim('"'); string station = items[1].Trim('"'); string ponumber = items[2].Trim('"'); string invoicenumber = items[3].Trim('"'); string invoicetype = items[4].Trim('"'); string filern = items[5].Trim('"'); string client = items[6].Trim('"'); string origin = items[7].Trim('"'); string destination = items[8].Trim('"'); string agingdate = items[9].Trim('"'); string activitydate = items[10].Trim('"'); if ((invoicenumber == "-") || (string.IsNullOrEmpty(invoicenumber))) { invoicenumber = "null"; } else { invoicenumber = "'" + invoicenumber + "'"; } if ((destination == "-") || (string.IsNullOrEmpty(destination))) { destination = "null"; } else { destination = "'" + destination + "'"; } string vendornumber = items[11].Trim('"'); string vendorname = items[12].Trim('"'); string vendorsite = items[13].Trim('"'); string vendorref = items[14].Trim('"'); string subaccount = items[15].Trim('"'); string osdaye = items[16].Trim('"'); string osaa = items[17].Trim('"'); string osda = items[18].Trim('"'); string our = items[19].Trim('"'); string squery = "INSERT INTO credit_debit1" + "([id],[Region],[Station],[PONumber],[InvoiceNumber],[InvoiceType],[FileRefNumber],[Client],[Origin],[Destination], " + "[AgingDate],[ActivityDate],[VendorNumber],[VendorName],[VendorSite],[VendorRef],[SubAccount],[OSDay],[OSAdvAmt],[OSDisbAmt], " + "[OverUnderRecovery] ) " + "VALUES " + "('" + count + "','" + region + "','" + station + "','" + ponumber + "'," + invoicenumber + ",'" + invoicetype + "','" + filern + "','" + client + "','" + origin + "'," + destination + "," + "'" + (string)agingdate.ToString() + "','" + (string)activitydate.ToString() + "','" + vendornumber + "',' " + vendorname + "',' " + vendorsite + "',' " + vendorref + "'," + "'" + subaccount + "','" + osdaye + "','" + osaa + "','" + osda + "','" + our + "') "; cmd = new SqlCommand(squery, con); cmd.CommandTimeout = 1500; cmd.ExecuteNonQuery(); } label2.Text = count.ToString(); Application.DoEvents(); count++; } MessageBox.Show("Process completed"); } else { MessageBox.Show("path select"); } } private void button3_Click(object sender, EventArgs e) { this.Close(); } 
private void Form1_Load(object sender, EventArgs e) { con = new SqlConnection("Data Source=192.168.50.200;User ID=EGL_TEST;Password=TEST;Initial Catalog=EGL_TEST;"); con.Open(); } } } The VendorName field contains data such as MCCOLLISTER'S TRANSPORTATION, so how do I pass this value?
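
    The root problem is that field values are concatenated straight into the SQL text, so an apostrophe in MCCOLLISTER'S terminates the string literal early (and leaves the code open to SQL injection). A hedged sketch using parameters instead, with the column list shortened; the real INSERT would bind every column the same way:

        string squery = "INSERT INTO credit_debit1 ([id],[Region],[Station],[VendorName]) " +
                        "VALUES (@id, @region, @station, @vendorName)";
        using (SqlCommand insert = new SqlCommand(squery, con))
        {
            insert.Parameters.AddWithValue("@id", count);
            insert.Parameters.AddWithValue("@region", region);
            insert.Parameters.AddWithValue("@station", station);
            insert.Parameters.AddWithValue("@vendorName", vendorname);  // apostrophes are passed safely
            insert.ExecuteNonQuery();
        }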

    Read the article

  • Master Data Management for Location Data - Oracle Site Hub

    - by david.butler(at)oracle.com
    Most MDM discussions cover key domains such as customer, supplier, product, service, and reference data. It is usually understood that these domains have complex structures and hundreds if not thousands of attributes that need governing. Location, on the other hand, strikes most people as address data. How hard can that be? But for many industries, locations are complex, and site information is critical to efficient operations and relevant analytics. Retail stores and malls, bank branches, construction sites come to mind. But one of the best industries for illustrating the power of a site mastering application is Oil & Gas.   Oracle's Master Data Management solution for location data is the Oracle Site Hub. It is a location mastering solution that enables organizations to centralize site and location specific information from heterogeneous systems, creating a single view of site information that can be leveraged across all functional departments and analytical systems.   Let's take a look at the location entities the Oracle Site Hub can manage for the Oil & Gas industry: organizations, property, land, buildings, roads, oilfield, service center, inventory site, real estate, facilities, refineries, storage tanks, vendor locations, businesses, assets; project site, area, well, basin, pipelines, critical infrastructure, offshore platform, compressor station, gas station, etc. Any site can be classified into multiple hierarchies, like organizational hierarchy, operational hierarchy, geographic hierarchy, divisional hierarchies and so on. Any site can also be associated to multiple clusters, i.e. collections of sites, and these can be used as a foundation for driving reporting, analysis, organize daily work, etc. Hierarchies can also be used to model entities which are structured or non-structured collections of nodes, like for example routes, pipelines and more. The User Defined Attribute Framework provides the needed infrastructure to add single row attributes groups like well base attributes (well IDs, well type, well structure and key characterizing measures, and more) and well geometry, and multi row attribute groups like well applications, permits, production data, activities, operations, logs, treatments, tests, drills, treatments, and KPIs. Site Hub can also model areas, lands, fields, basins, pools, platforms, eco-zones, and stratigraphic layers as specific sites, tracking their base attributes, aliases, descriptions, subcomponents and more. Midstream entities (pipelines, logistic sites, pump stations) and downstream entities (cylinders, tanks, inventories, meters, partner's sites, routes, facilities, gas stations, and competitor sites) can also be easily modeled, together with their specific attributes and relationships. Site Hub can store any type of unstructured data associated to a site. This could be stored directly or on an external content management solution, like Oracle Universal Content Management. Considering a well, for example, Site Hub can store any relevant associated multimedia file such as: CAD drawings of the well profile, structure and/or parts, engineering documents, contracts, applications, permits, logs, pictures, photos, videos and more. For any site entity, Site Hub can associate all the related assets and equipments at the site, as well as all relationships between sites, between a site and multiple parties, and between a site and any purchasable or sellable item, over time. 
Items can be equipment, instruments, facilities, services, products, production entities, production facilities (pipelines, batteries, compressor stations, gas plants, meters, separators, etc.), support facilities (rigs, roads, transmission or radio towers, airstrips, etc.), supplier products and services, catalogs, and more. Items can just be associated to sites using standard Site Hub features, or they can be fully mastered by implementing Oracle Product Hub. Site locations (addresses or geographical coordinates) are also managed with out-of-the-box address geo-coding capabilities coupled with Google Maps integration to deliver powerful mapping capabilities and spatial data analysis. Locations can be shared between different sites. Centered on the site location, any site can also have associated areas. Site Hub can master any site location specific information, like for example cadastral, ownership, jurisdictional, geological, seismic and more, and any site-centric area specific information, like for example economical, political, risk, weather, logistic, traffic information and more. Now if anyone ever asks you why locations need MDM, think about how all these Oil & Gas entities and attributes would translate into your business locations. To learn more about Oracle's full MDM solution for the digital oil field, here is a link to Roberto Negro's outstanding whitepaper: Oracle Site Master Data Management for mastering wells and other PPDM entities in a digital oilfield context  

    Read the article

  • EPPM Is a Must-Have Capability as Global Energy and Power Industries Eye US$38 Trillion in New Investments

    - by Melissa Centurio Lopes
    “The process manufacturing industry is facing an unprecedented challenge: from now until 2035, cumulative worldwide investments of US$38 trillion will be required for drilling, power generation, and other energy projects,” Iain Graham, director of energy and process manufacturing for Oracle’s Primavera, said in a recent webcast. He adds that process manufacturing organizations such as oil and gas, utilities, and chemicals must manage this level of investment in an environment of constrained capital markets, erratic supply and demand, aging infrastructure, heightened regulations, and declining global skills. In the following interview, Graham explains how the right enterprise project portfolio management (EPPM) technology can help the industry meet these imperatives. Q: Why is EPPM so important for today’s process manufacturers? A: If the industry invests US$38 trillion without proper cost controls in place, a huge amount of resources will be put at risk, especially when it comes to cost overruns that may occur in large capital projects. Process manufacturing companies must not only control costs, but also monitor all the various contractors that will be involved in each project. If you’re not managing your own workers and all the interdependencies among the different contractors, then you’ve got problems. Q: What else should process manufacturers look for? A: It’s also important that an EPPM solution has the ability to manage more than just capital projects. For example, it’s best to manage maintenance and capital projects in the same system. Say you’re due to install a new transformer in a power station as part of a capital project, but routine maintenance in that area of the facility is scheduled for that morning. The lack of coordination could lead to unforeseen delays. There are also IT considerations that impact capital projects, such as adding servers and network cable for a control system in a power station. What organizations need is a true EPPM system that’s not just for capital projects, maintenance, or IT activities, but instead an enterprisewide solution that provides visibility into all types of projects. Read the complete Q&A here and discover the practical framework for successfully managing this massive capital spending.

    Read the article

  • Prevent full table scan for query with multiple where clauses

    - by Dave Jarvis
    A while ago I posted a message about optimizing a query in MySQL. I have since ported the data and query to PostgreSQL, but now PostgreSQL has the same problem. The solution in MySQL was to force the optimizer to not optimize using STRAIGHT_JOIN. PostgreSQL offers no such option. Here is the explain: Here is the query: SELECT avg(d.amount) AS amount, y.year FROM station s, station_district sd, year_ref y, month_ref m, daily d LEFT JOIN city c ON c.id = 10663 WHERE -- Find all the stations within a specific unit radius ... -- 6371.009 * SQRT( POW(RADIANS(c.latitude_decimal - s.latitude_decimal), 2) + (COS(RADIANS(c.latitude_decimal + s.latitude_decimal) / 2) * POW(RADIANS(c.longitude_decimal - s.longitude_decimal), 2)) ) <= 50 AND -- Ignore stations outside the given elevations -- s.elevation BETWEEN 0 AND 2000 AND sd.id = s.station_district_id AND -- Gather all known years for that station ... -- y.station_district_id = sd.id AND -- The data before 1900 is shaky; insufficient after 2009. -- y.year BETWEEN 1980 AND 2000 AND -- Filtered by all known months ... -- m.year_ref_id = y.id AND m.month = 12 AND -- Whittled down by category ... -- m.category_id = '001' AND -- Into the valid daily climate data. -- m.id = d.month_ref_id AND d.daily_flag_id <> 'M' GROUP BY y.year It appears as though PostgreSQL is looking at the DAILY table first, which is simply not the right way to go about this query as there are nearly 300 million rows. How do I force PostgreSQL to start at the CITY table? Thank you!
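
    PostgreSQL has no STRAIGHT_JOIN, but the planner can be steered in other ways. On the 8.x releases that ship with Ubuntu 10.04, a WITH query acts as an optimization fence, so putting the small city/station filter in a CTE forces it to be evaluated first; lowering join_collapse_limit for the session is another option. A rough sketch of the CTE form (the station_district link is collapsed to the direct equality it enforces):

        WITH nearby_stations AS (
            SELECT s.station_district_id
            FROM   city c, station s
            WHERE  c.id = 10663
            AND    6371.009 * SQRT( POW(RADIANS(c.latitude_decimal - s.latitude_decimal), 2)
                     + (COS(RADIANS(c.latitude_decimal + s.latitude_decimal) / 2)
                     * POW(RADIANS(c.longitude_decimal - s.longitude_decimal), 2)) ) <= 50
            AND    s.elevation BETWEEN 0 AND 2000
        )
        SELECT avg(d.amount) AS amount, y.year
        FROM   nearby_stations ns
        JOIN   year_ref  y ON y.station_district_id = ns.station_district_id
        JOIN   month_ref m ON m.year_ref_id = y.id
        JOIN   daily     d ON d.month_ref_id = m.id
        WHERE  y.year BETWEEN 1980 AND 2000
        AND    m.month = 12
        AND    m.category_id = '001'
        AND    d.daily_flag_id <> 'M'
        GROUP  BY y.year;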

    Read the article

  • AvalonDock + UserControl + DataGrid + ContextMenu command routing issue

    - by repka
    I have this kind of layout: <Window x:Class="DockAndMenuTest.MainWindow" xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation" xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml" xmlns:ad="clr-namespace:AvalonDock;assembly=AvalonDock" Title="MainWindow" Height="350" Width="525"> <ad:DockingManager> <ad:DocumentPane> <ad:DockableContent Title="Doh!"> <UserControl> <UserControl.CommandBindings> <CommandBinding Command="Zoom" Executed="ExecuteZoom" CanExecute="CanZoom"/> </UserControl.CommandBindings> <DataGrid Name="_evilGrid"> <DataGrid.Resources> <Style TargetType="DataGridRow"> <Setter Property="ContextMenu"> <Setter.Value> <ContextMenu> <MenuItem Command="Zoom"/> </ContextMenu> </Setter.Value> </Setter> </Style> </DataGrid.Resources> </DataGrid> </UserControl> </ad:DockableContent> </ad:DocumentPane> </ad:DockingManager> </Window> Briefly: ContextMenu is set for each DataGridRow of DataGrid inside UserControl, which in its turn is inside DockableContent of AvalonDock. Code-behind is trivial as well: public partial class MainWindow { public MainWindow() { InitializeComponent(); _evilGrid.ItemsSource = new[] { Tuple.Create(1, 2, 3), Tuple.Create(4, 4, 3), Tuple.Create(6, 7, 1), }; } private void ExecuteZoom(object sender, ExecutedRoutedEventArgs e) { MessageBox.Show("zoom !"); } private void CanZoom(object sender, CanExecuteRoutedEventArgs e) { e.CanExecute = true; } } So here's the problem: right-clicking on the selected row (if it it was selected before the right click) my command comes out disabled. The command is "Zoom" in this case, but can be any other, including a custom one. If I get rid of either docking or UserControl around my grid there are no problems. ListBox doesn't have this issue either. So I don't know what's at fault here. SNOOP shows that in cases when this propagation fails, instead of UserControl, CanExecute is handled by PART_ShowContextMenuButton (Button), which is part of docking header. I've had other issues with UI command propagation within UserControls hosted inside AvalonDock, but this one is the easiest to reproduce.
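
    A common workaround when a ContextMenu's CanExecute gets routed to the wrong element is to point the MenuItem back at the control the menu was opened on, since the menu lives in its own visual tree. A hedged sketch (not verified against AvalonDock specifically):

        <ContextMenu>
            <MenuItem Command="Zoom"
                      CommandTarget="{Binding Path=PlacementTarget,
                                              RelativeSource={RelativeSource AncestorType={x:Type ContextMenu}}}"/>
        </ContextMenu>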

    Read the article

  • Loading data from a dictionary of dictionaries into an array in Objective-C for an iPhone app

    - by Kat
    I have a UINavigationController consisting of a tableview I want to load some data into. I have a dictionary plist containing Dictionaries for each Train Line which in turn have dictionaries for each station with the relevant information along with one string lineName. I need to collect the station Names keys and add them to an array to populate my table (This is working). The line names are stored as a string in my lines dictionary with the key being "lineName" Root->| | |->TrainLine1(Dictionary)->| | |-> lineName (String) | |-> Station1 (Dictionary) | |-> Station2 (Dictionary) | | |->TrainLine2(Dictionary)->| | |-> lineName (String) | |-> Station1 (Dictionary) | |-> Station2 (Dictionary) Am I going about this the wrong way? Should I reorganise my plist? The code below crashes the app. - (void)viewDidLoad { self.title = @"Train Lines"; NSString *path = [[NSBundle mainBundle] pathForResource:@"lineDetails" ofType:@"plist"]; NSDictionary *dictionary = [[NSDictionary alloc] initWithContentsOfFile:path]; NSMutableArray *array = [[NSMutableArray alloc] init]; NSMutableArray *lineNameArray = [[NSMutableArray alloc] init]; NSString *key; for (key in dictionary) { NSMutableDictionary *secondDictionary = [NSDictionary dictionaryWithDictionary:[dictionary valueForKey:key]]; [lineNameArray addObject:key]; NSLog(@"Adding this in array:%@", key); [array addObject:[secondDictionary objectForKey:kLineNameKey]]; } self.trainLines = array; self.trainLineKeys = lineNameArray; NSLog(@"Array contents:%@", self.trainLineKeys); [lineNameArray release]; [array release]; [dictionary release]; [super viewDidLoad]; }
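
    One likely crash is addObject: being handed nil when a line dictionary has no value under kLineNameKey (NSMutableArray throws on nil). A defensive sketch of the loop body, assuming kLineNameKey matches the key spelling in the plist exactly:

        for (NSString *key in dictionary) {
            NSDictionary *lineDict = [dictionary objectForKey:key];
            NSString *lineName = [lineDict objectForKey:kLineNameKey];
            if (lineName != nil) {
                [lineNameArray addObject:key];
                [array addObject:lineName];
            } else {
                NSLog(@"No %@ entry found for line %@", kLineNameKey, key);
            }
        }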

    Read the article

  • Parsing some results returned by nokogiri in ruby, getting an error message

    - by Khat
    The following code returns an error: require 'nokogiri' require 'open-uri' @doc = Nokogiri::HTML(open("http://www.amt.qc.ca/train/deux-montagnes/deux-montagnes.aspx")) #@doc = Nokogiri::HTML(File.open("deux-montagnes.html")) stations = @doc.xpath("//area") stations.each { |station| str = station reg = /href="(.*)" title="(.*)"/ href = reg.match(str)[1] title = reg.match(str)[2] page = /.*\/(.*).aspx$/.match(href)[1] puts href puts title puts page base_url = "http://www.amt.qc.ca" complete_url = base_url + href puts complete_url } ERROR: station_names_from_map.rb:9:in `block in <main>': undefined method `[]' for nil:NilClass (NoMethodError) from /opt/local/lib/ruby1.9/gems/1.9.1/gems/nokogiri-1.4.1/lib/nokogiri/xml/node_set.rb:213:in `block in each' from /opt/local/lib/ruby1.9/gems/1.9.1/gems/nokogiri-1.4.1/lib/nokogiri/xml/node_set.rb:212:in `upto' from /opt/local/lib/ruby1.9/gems/1.9.1/gems/nokogiri-1.4.1/lib/nokogiri/xml/node_set.rb:212:in `each' from station_names_from_map.rb:7:in `<main>' shell returned 1 While this code works: str = '<area shape="poly" alt="Deux-Montagnes" coords="59,108,61,106,65,106,67,108,67,113,65,115,61,115,59,113" href="/train/deux-montagnes/deux-montagnes.aspx" title="Deux-Montagnes">' reg = /href="(.*)" title="(.*)"/ href = reg.match(str)[1] title = reg.match(str)[2] page = /.*\/(.*).aspx$/.match(href)[1] puts href puts title puts page base_url = "http://www.amt.qc.ca" complete_url = base_url + href puts complete_url Any reason why?
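
    The NoMethodError means reg.match returned nil: the regex is being run against the Nokogiri node object rather than against HTML text in the exact attribute order the pattern expects. Nokogiri already exposes the attributes directly, so no regex is needed; a rough sketch:

        stations.each do |station|
          href  = station['href']            # attribute access on the node
          title = station['title']
          next if href.nil? || title.nil?    # some <area> tags may lack these attributes
          page  = File.basename(href, '.aspx')
          puts href, title, page
          puts "http://www.amt.qc.ca" + href
        end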

    Read the article

  • PHP string handling tricks

    - by Dam
    I need to extract the 10 words before and the 10 words after a given keyword in a text, i.e. the extract should start 10 words before the keyword and end 10 words after it. Given keyword: "Twenty-three". The main difficulty is that the content contains HTML tags, and those tags need to be kept with the extracted words; only the words from 10 before to 10 after should remain. The content is below: <div id="hpFeatureBoxInt"><h2><span class="dy">Top News Story</span></h2><h3><a href="/go/homepage/i/int/news/world/1/-/news/1/hi/world/europe/8592190.stm">Suicide bombings hit Moscow Metro</a></h3><p>Past suicide bombings in Moscow have been blamed on Islamist rebels At least 35 people have been killed after two female suicide bombers blew themselves up on Moscow Metro trains in the morning rush hour, officials say.<img height="150" width="201" alt="Emergency services carry a body from a Metro station in Moscow (29 March 2010)" src="http://wwwimg.bbc.co.uk/feedengine/homepage/images/_47550689_moscowap203_201x150.jpg">Twenty-three died in the first blast at 0756 (0356 GMT) as a<a href="#"> train stood </a>at the central Lubyanka station, beneath the offices of the FSB intelligence agency.About 40 minutes later, a second explosion ripped through a train at Park Kultury, leaving another 12 dead.No-one has said they carried out the worst attack in the capital since 2004. </p><p id="fbilisten"><a href="/go/homepage/i/int/news/heading/-/news/">More from BBC News</a></p></div> Thank you
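
    A rough PHP sketch of the plain-text version of this (it strips the markup first, so it does not yet preserve the surrounding tags the way the question ultimately wants; function and variable names are just examples):

        <?php
        function words_around($html, $keyword, $n = 10) {
            $text  = strip_tags($html);                                  // drop the markup for matching
            $words = preg_split('/\s+/', $text, -1, PREG_SPLIT_NO_EMPTY);
            $pos   = false;
            foreach ($words as $i => $w) {                               // match even if punctuation follows
                if (strpos($w, $keyword) === 0) { $pos = $i; break; }
            }
            if ($pos === false) {
                return '';
            }
            $start = max(0, $pos - $n);
            $slice = array_slice($words, $start, ($pos - $start) + 1 + $n);
            return implode(' ', $slice);
        }

        echo words_around($content, 'Twenty-three');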

    Read the article

  • Optimal two variable linear regression SQL statement (censoring outliers)

    - by Dave Jarvis
    Problem Am looking to apply the y = mx + b equation (where m is SLOPE, b is INTERCEPT) to a data set, which is retrieved as shown in the SQL code. The values from the (MySQL) query are: SLOPE = 0.0276653965651912 INTERCEPT = -57.2338357550468 SQL Code SELECT ((sum(t.YEAR) * sum(t.AMOUNT)) - (count(1) * sum(t.YEAR * t.AMOUNT))) / (power(sum(t.YEAR), 2) - count(1) * sum(power(t.YEAR, 2))) as SLOPE, ((sum( t.YEAR ) * sum( t.YEAR * t.AMOUNT )) - (sum( t.AMOUNT ) * sum(power(t.YEAR, 2)))) / (power(sum(t.YEAR), 2) - count(1) * sum(power(t.YEAR, 2))) as INTERCEPT FROM (SELECT D.AMOUNT, Y.YEAR FROM CITY C, STATION S, YEAR_REF Y, MONTH_REF M, DAILY D WHERE -- For a specific city ... -- C.ID = 8590 AND -- Find all the stations within a 15 unit radius ... -- SQRT( POW( C.LATITUDE - S.LATITUDE, 2 ) + POW( C.LONGITUDE - S.LONGITUDE, 2 ) ) <15 AND -- Gather all known years for that station ... -- S.STATION_DISTRICT_ID = Y.STATION_DISTRICT_ID AND -- The data before 1900 is shaky; insufficient after 2009. -- Y.YEAR BETWEEN 1900 AND 2009 AND -- Filtered by all known months ... -- M.YEAR_REF_ID = Y.ID AND -- Whittled down by category ... -- M.CATEGORY_ID = '001' AND -- Into the valid daily climate data. -- M.ID = D.MONTH_REF_ID AND D.DAILY_FLAG_ID <> 'M' GROUP BY Y.YEAR ORDER BY Y.YEAR ) t Data The data is visualized here (with five outliers highlighted): Questions How do I return the y value against all rows without repeating the same query to collect and collate the data? That is, how do I "reuse" the list of t values? How would you change the query to eliminate outliers (at an 85% confidence interval)? The following results (to calculate the start and end points of the line) appear incorrect. Why are the results off by ~10 degrees (e.g., outliers skewing the data)? (1900 * 0.0276653965651912) + (-57.2338357550468) = -4.66958228 (2009 * 0.0276653965651912) + (-57.2338357550468) = -1.65405406 I would have expected the 1900 result to be around 10 (not -4.67) and the 2009 result to be around 11.50 (not -1.65). Thank you!

    Read the article

  • How to delete sentences starting with a lower case letter?

    - by Ron
    Hello: In the example below the following regex (".*?") was used to remove all dialogue first. The next step is to remove all remaining sentences starting with a lower case letter. Only sentences starting with an upper case letter should remain. Example: exclaimed Wade. Indeed, below them were villages, of crude huts made of timber and stone and mud. Rubble work walls, for they needed little shelter here, and the people were but savages. asked Arcot, his voice a bit unsteady with suppressed excitement. replied Morey without turning from his station at the window. Below them now, less than half a mile down on the patchwork of the Nile valley, men were standing, staring up, collecting in little groups, gesticulating toward the strange thing that had materialized in the air above them. In the example above the following should be deleted only: exclaimed Wade. asked Arcot, his voice a bit unsteady with suppressed excitement. replied Morey without turning from his station at the window. A useful regex or simple Perl or python code is appreciated. I'm using version 7 of Textpipe. Thanks.
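
    A small Python sketch of this second step, run after the dialogue has already been removed (it assumes sentences end with ., ! or ? followed by whitespace):

        import re

        text = open("input.txt", encoding="utf-8").read()

        # split into sentences on terminal punctuation followed by whitespace
        sentences = re.split(r'(?<=[.!?])\s+', text)

        # keep only sentences that start with an upper-case letter
        kept = [s for s in sentences if s and s[0].isupper()]

        print(' '.join(kept))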

    Read the article

  • Optimal two variable linear regression SQL statement

    - by Dave Jarvis
    Problem Am looking to apply the y = mx + b equation (where m is SLOPE, b is INTERCEPT) to a data set, which is retrieved as shown in the SQL code. The values from the (MySQL) query are: SLOPE = 0.0276653965651912 INTERCEPT = -57.2338357550468 SQL Code SELECT ((sum(t.YEAR) * sum(t.AMOUNT)) - (count(1) * sum(t.YEAR * t.AMOUNT))) / (power(sum(t.YEAR), 2) - count(1) * sum(power(t.YEAR, 2))) as SLOPE, ((sum( t.YEAR ) * sum( t.YEAR * t.AMOUNT )) - (sum( t.AMOUNT ) * sum(power(t.YEAR, 2)))) / (power(sum(t.YEAR), 2) - count(1) * sum(power(t.YEAR, 2))) as INTERCEPT FROM (SELECT D.AMOUNT, Y.YEAR FROM CITY C, STATION S, YEAR_REF Y, MONTH_REF M, DAILY D WHERE -- For a specific city ... -- C.ID = 8590 AND -- Find all the stations within a 5 unit radius ... -- SQRT( POW( C.LATITUDE - S.LATITUDE, 2 ) + POW( C.LONGITUDE - S.LONGITUDE, 2 ) ) <15 AND -- Gather all known years for that station ... -- S.STATION_DISTRICT_ID = Y.STATION_DISTRICT_ID AND -- The data before 1900 is shaky; and insufficient after 2009. -- Y.YEAR BETWEEN 1900 AND 2009 AND -- Filtered by all known months ... -- M.YEAR_REF_ID = Y.ID AND -- Whittled down by category ... -- M.CATEGORY_ID = '001' AND -- Into the valid daily climate data. -- M.ID = D.MONTH_REF_ID AND D.DAILY_FLAG_ID <> 'M' GROUP BY Y.YEAR ORDER BY Y.YEAR ) t Data The data is visualized here: Questions How do I return the y value against all rows without repeating the same query to collect and collate the data? That is, how do I "reuse" the list of t values? How would you change the query to eliminate outliers (at an 85% confidence interval)? The following results (to calculate the start and end points of the line) appear incorrect. Why are the results off by ~10 degrees (e.g., outliers skewing the data)? (1900 * 0.0276653965651912) + (-57.2338357550468) = -4.66958228 (2009 * 0.0276653965651912) + (-57.2338357550468) = -1.65405406 I would have expected the 1900 result to be around 10 (not -4.67) and the 2009 result to be around 11.50 (not -1.65). Thank you!

    Read the article

  • Optimal two variable linear regression calculation

    - by Dave Jarvis
    Problem Am looking to apply the y = mx + b equation (where m is SLOPE, b is INTERCEPT) to a data set, which is retrieved as shown in the SQL code. The values from the (MySQL) query are: SLOPE = 0.0276653965651912 INTERCEPT = -57.2338357550468 SQL Code SELECT ((sum(t.YEAR) * sum(t.AMOUNT)) - (count(1) * sum(t.YEAR * t.AMOUNT))) / (power(sum(t.YEAR), 2) - count(1) * sum(power(t.YEAR, 2))) as SLOPE, ((sum( t.YEAR ) * sum( t.YEAR * t.AMOUNT )) - (sum( t.AMOUNT ) * sum(power(t.YEAR, 2)))) / (power(sum(t.YEAR), 2) - count(1) * sum(power(t.YEAR, 2))) as INTERCEPT, FROM (SELECT D.AMOUNT, Y.YEAR FROM CITY C, STATION S, YEAR_REF Y, MONTH_REF M, DAILY D WHERE -- For a specific city ... -- C.ID = 8590 AND -- Find all the stations within a 15 unit radius ... -- SQRT( POW( C.LATITUDE - S.LATITUDE, 2 ) + POW( C.LONGITUDE - S.LONGITUDE, 2 ) ) < 15 AND -- Gather all known years for that station ... -- S.STATION_DISTRICT_ID = Y.STATION_DISTRICT_ID AND -- The data before 1900 is shaky; insufficient after 2009. -- Y.YEAR BETWEEN 1900 AND 2009 AND -- Filtered by all known months ... -- M.YEAR_REF_ID = Y.ID AND -- Whittled down by category ... -- M.CATEGORY_ID = '001' AND -- Into the valid daily climate data. -- M.ID = D.MONTH_REF_ID AND D.DAILY_FLAG_ID <> 'M' GROUP BY Y.YEAR ORDER BY Y.YEAR ) t Data The data is visualized here: Question The following results (to calculate the start and end points of the line) appear incorrect. Why are the results off by ~10 degrees (e.g., outliers skewing the data)? (1900 * 0.0276653965651912) + (-57.2338357550468) = -4.66958228 (2009 * 0.0276653965651912) + (-57.2338357550468) = -1.65405406 I would have expected the 1900 result to be around 10 (not -4.67) and the 2009 result to be around 11.50 (not -1.65). Related Sites Least absolute deviations Robust regression Thank you!

    Read the article

  • Start program on USB hardware plug-in

    - by petebob796
    Is there a way to detect when a specific device is plugged into a USB port? What I would like is for several apps to start when I plug my laptop into my docking station, to account for the different keyboard, mouse and monitors. Specifically, I have an issue with some software for my G15 keyboard stopping Media Player from closing properly. Hopefully in .NET, but if not, any suggestions are appreciated.
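
    A rough .NET sketch using WMI to get notified when a device arrives, then matching on the device name before launching anything (the name string and program path are placeholders; the real values come from Device Manager and your setup):

        using System;
        using System.Diagnostics;
        using System.Management;   // reference System.Management.dll

        class DockWatcher
        {
            static void Main()
            {
                // fire when any new plug-and-play device instance appears, polled every 2 seconds
                var query = new WqlEventQuery(
                    "SELECT * FROM __InstanceCreationEvent WITHIN 2 " +
                    "WHERE TargetInstance ISA 'Win32_PnPEntity'");

                using (var watcher = new ManagementEventWatcher(query))
                {
                    watcher.EventArrived += (sender, e) =>
                    {
                        var device = (ManagementBaseObject)e.NewEvent["TargetInstance"];
                        string name = device["Name"] as string ?? "";
                        if (name.Contains("Logitech G15"))                 // placeholder device name
                            Process.Start(@"C:\Tools\DockSetup.exe");      // placeholder program
                    };
                    watcher.Start();
                    Console.ReadLine();   // keep watching until Enter is pressed
                    watcher.Stop();
                }
            }
        }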

    Read the article

  • Filter large amounts of data in a table w/ jQuery

    - by Bry4n
    I work for a transit agency and I have large amounts of data (mostly times), and I need a way to filter the data using two textboxes (To and From). I found jQuery quick search, but it seems to only work with one textbox. If anyone has any ideas via jQuery or some other client side library, that would be fantastic. Ideal example: To: [Textbox] From:[Textbox] <table> <tr> <td>69th street</td><td>5:00pm</td><td>5:06pm</td><td>5:10pm</td><td>5:20pm</td> </tr> <tr> <td>Millbourne</td><td>5:09pm</td><td>5:15pm</td><td>5:20pm</td><td>5:25pm</td> </tr> <tr> <td>Spring Garden</td><td>6:00pm</td><td>6:15pm</td><td>6:20pm</td><td>6:25pm</td> </tr> </table> So If I start typing in one of the stations in the To: textbox it either displays dynamically like the quick search or i have to press a button (either or) and then in the from: textbox. Lastly it shows me to: station and all its times on the left and the from: station and all its times on the right.
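
    A small jQuery sketch of the filtering idea, assuming the two textboxes have ids "to" and "from" and the station name sits in the first cell of each row (the selectors are placeholders to adapt):

        $('#to, #from').keyup(function () {
          var to   = $.trim($('#to').val()).toLowerCase();
          var from = $.trim($('#from').val()).toLowerCase();

          $('#schedule tr').each(function () {
            var station = $(this).find('td:first').text().toLowerCase();
            var matches = (to   !== '' && station.indexOf(to)   !== -1) ||
                          (from !== '' && station.indexOf(from) !== -1);
            // show matching stations; show everything while both boxes are empty
            $(this).toggle(matches || (to === '' && from === ''));
          });
        });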

    Read the article

  • Need to place a floating modeless form over the Excel main window (quasi-task pane)

    - by code4life
    Hi I need to emulate a task pane by floating a modeless form over the Excel main window. The reason for this requirement is that I need to have taskpane features for my Excel 2003 add-in, but cannot use the document-centric model. Can anyone suggest what would be the best way to do this? The modeless form would need to detect the main window resize event and resize itself accordingly, and also need to always position itself at the bottom of the window (kind of like a docking pane).
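
    One approach that sometimes works for this is to make the Excel main window the owner of the modeless form, so the form floats above Excel; repositioning is then done by hand from Excel's resize events. A hedged sketch (Application.Hwnd needs Excel 2002 or later; the form class name is a placeholder):

        // excelApp is the running Excel.Application instance held by the add-in
        var excelWindow = new System.Windows.Forms.NativeWindow();
        excelWindow.AssignHandle(new IntPtr(excelApp.Hwnd));

        var pane = new TaskPaneForm();   // placeholder for your modeless form class
        pane.Show(excelWindow);          // modeless and owned, so it stays above Excel

        // to keep it glued to the bottom of the window, handle excelApp.WindowResize
        // (and WindowActivate) and recompute pane.Bounds there from the
        // Left/Top/Width/Height of excelApp.ActiveWindow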

    Read the article

  • Remove redundant SQL code

    - by Dave Jarvis
    Code The following code calculates the slope and intercept for a linear regression against a slathering of data. It then applies the equation y = mx + b against the same result set to calculate the value of the regression line for each row. Can the two separate sub-selects be joined so that the data and its slope/intercept are calculated without executing the data gathering part of the query twice? SELECT AVG(D.AMOUNT) as AMOUNT, Y.YEAR * ymxb.SLOPE + ymxb.INTERCEPT as REGRESSION_LINE, Y.YEAR as YEAR, MAKEDATE(Y.YEAR,1) as AMOUNT_DATE FROM CITY C, STATION S, YEAR_REF Y, MONTH_REF M, DAILY D, (SELECT ((avg(t.AMOUNT * t.YEAR)) - avg(t.AMOUNT) * avg(t.YEAR)) / (stddev( t.AMOUNT ) * stddev( t.YEAR )) as CORRELATION, ((sum(t.YEAR) * sum(t.AMOUNT)) - (count(1) * sum(t.YEAR * t.AMOUNT))) / (power(sum(t.YEAR), 2) - count(1) * sum(power(t.YEAR, 2))) as SLOPE, ((sum( t.YEAR ) * sum( t.YEAR * t.AMOUNT )) - (sum( t.AMOUNT ) * sum(power(t.YEAR, 2)))) / (power(sum(t.YEAR), 2) - count(1) * sum(power(t.YEAR, 2))) as INTERCEPT FROM ( SELECT AVG(D.AMOUNT) as AMOUNT, Y.YEAR as YEAR, MAKEDATE(Y.YEAR,1) as AMOUNT_DATE FROM CITY C, STATION S, YEAR_REF Y, MONTH_REF M, DAILY D WHERE $X{ IN, C.ID, CityCode } AND SQRT( POW( C.LATITUDE - S.LATITUDE, 2 ) + POW( C.LONGITUDE - S.LONGITUDE, 2 ) ) < $P{Radius} AND S.STATION_DISTRICT_ID = Y.STATION_DISTRICT_ID AND Y.YEAR BETWEEN 1900 AND 2009 AND M.YEAR_REF_ID = Y.ID AND M.CATEGORY_ID = $P{CategoryCode} AND M.ID = D.MONTH_REF_ID AND D.DAILY_FLAG_ID <> 'M' GROUP BY Y.YEAR ) t ) ymxb WHERE $X{ IN, C.ID, CityCode } AND SQRT( POW( C.LATITUDE - S.LATITUDE, 2 ) + POW( C.LONGITUDE - S.LONGITUDE, 2 ) ) < $P{Radius} AND S.STATION_DISTRICT_ID = Y.STATION_DISTRICT_ID AND Y.YEAR BETWEEN 1900 AND 2009 AND M.YEAR_REF_ID = Y.ID AND M.CATEGORY_ID = $P{CategoryCode} AND M.ID = D.MONTH_REF_ID AND D.DAILY_FLAG_ID <> 'M' GROUP BY Y.YEAR Question How do I execute the duplicate bits only once per query, instead of twice? The duplicate bit is the WHERE clause: $X{ IN, C.ID, CityCode } AND SQRT( POW( C.LATITUDE - S.LATITUDE, 2 ) + POW( C.LONGITUDE - S.LONGITUDE, 2 ) ) < $P{Radius} AND S.STATION_DISTRICT_ID = Y.STATION_DISTRICT_ID AND Y.YEAR BETWEEN 1900 AND 2009 AND M.YEAR_REF_ID = Y.ID AND M.CATEGORY_ID = $P{CategoryCode} AND M.ID = D.MONTH_REF_ID AND D.DAILY_FLAG_ID <> 'M' Related http://stackoverflow.com/questions/1595659/how-to-eliminate-duplicate-calculation-in-sql Thank you!
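
    If the database is new enough to support common table expressions (MySQL 8.0+ / MariaDB 10.2+), the shared row set can be named once and read twice; on older servers the same effect needs a scratch table filled by a separate statement. A rough sketch of the CTE form, keeping the JasperReports $X{...}/$P{...} placeholders as they are:

        WITH t AS (
          SELECT AVG(D.AMOUNT) AS AMOUNT, Y.YEAR AS YEAR
          FROM CITY C, STATION S, YEAR_REF Y, MONTH_REF M, DAILY D
          WHERE $X{ IN, C.ID, CityCode }
            AND SQRT( POW( C.LATITUDE - S.LATITUDE, 2 ) + POW( C.LONGITUDE - S.LONGITUDE, 2 ) ) < $P{Radius}
            AND S.STATION_DISTRICT_ID = Y.STATION_DISTRICT_ID
            AND Y.YEAR BETWEEN 1900 AND 2009
            AND M.YEAR_REF_ID = Y.ID
            AND M.CATEGORY_ID = $P{CategoryCode}
            AND M.ID = D.MONTH_REF_ID
            AND D.DAILY_FLAG_ID <> 'M'
          GROUP BY Y.YEAR
        )
        SELECT t.AMOUNT                             AS AMOUNT,
               t.YEAR * ymxb.SLOPE + ymxb.INTERCEPT AS REGRESSION_LINE,
               t.YEAR                               AS YEAR,
               MAKEDATE(t.YEAR, 1)                  AS AMOUNT_DATE
        FROM t
        CROSS JOIN (
          SELECT ((SUM(YEAR) * SUM(AMOUNT)) - (COUNT(1) * SUM(YEAR * AMOUNT)))
                   / (POW(SUM(YEAR), 2) - COUNT(1) * SUM(POW(YEAR, 2))) AS SLOPE,
                 ((SUM(YEAR) * SUM(YEAR * AMOUNT)) - (SUM(AMOUNT) * SUM(POW(YEAR, 2))))
                   / (POW(SUM(YEAR), 2) - COUNT(1) * SUM(POW(YEAR, 2))) AS INTERCEPT
          FROM t
        ) ymxb
        ORDER BY t.YEAR;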

    Read the article
