Search Results

Search found 17914 results on 717 pages for 'xml schema'.

Page 422/717 | < Previous Page | 418 419 420 421 422 423 424 425 426 427 428 429  | Next Page >

  • Image Grabbing with False Referer

    - by Mr Carl
    Hey guys, I'm struggling with grabbing an image at the moment... sounds silly, but check out this link :P http://manga.justcarl.co.uk/A/Oishii_Kankei/31/1 If you get the image URL, the image loads. Go back, and it looks like it's working fine, but that's just the browser loading up the cached image. The application was working fine before; I'm thinking they implemented some kind of Referer check on their images. So I found some code and came up with the following: $ref = 'http://www.thesite.com/'; $file = 'theimage.jpg'; $hdrs = array( 'http' => array( 'method' => "GET", 'header' => "accept-language: en\r\n" . "Accept: application/xml,application/xhtml+xml,text/html;q=0.9,text/plain;q=0.8,image/png,*/*;q=0.5\r\n" . "Referer: $ref\r\n" . // setting the http-referer "Content-Type: image/jpeg\r\n" ) ); // get the requested page from the server with our header as a request-header $context = stream_context_create($hdrs); $fp = fopen($imgChapterPath.$file, 'rb', false, $context); fpassthru($fp); fclose($fp); Essentially it's making up a false referrer. All I'm getting back, though, is a bunch of gibberish (thanks to fpassthru), so I think it's getting the image, but I'm afraid to say I have no idea how to output/display the collected image. Cheers, Carl
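
    A minimal sketch of one way to return the fetched image to the browser, assuming the request with the spoofed Referer already succeeds: send an image Content-Type header before streaming the bytes. The variable names below mirror the question.

      $ref  = 'http://www.thesite.com/';
      $url  = $imgChapterPath . $file;      // the remote image URL, as in the question
      $opts = array('http' => array(
          'method' => 'GET',
          'header' => "Referer: $ref\r\n"   // spoofed referrer
      ));
      $context = stream_context_create($opts);

      // Tell the browser an image is coming, then stream the raw bytes through.
      header('Content-Type: image/jpeg');
      readfile($url, false, $context);      // or fopen() + fpassthru(), as in the question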

    Read the article

  • How to search cvs comment history

    - by Chris Noe
    I am aware of this command: cvs log -N -w<userid> -d"1 day ago" Unfortunately this generates a formatted report with lots of newlines in it, such that the file-path, the file-version, and the comment-text are all on separate lines. Therefore it is difficult to scan it for all occurrences of comment text, (eg, grep), and correlate the matches to file/version. (Note that the log output would be perfectly acceptable, if only cvs could perform the filtering natively.) EDIT: Sample output. A block of text like this is reported for each repository file: RCS file: /data/cvs/dps/build.xml,v Working file: build.xml head: 1.49 branch: locks: strict access list: keyword substitution: kv total revisions: 57; selected revisions: 1 description: ---------------------------- revision 1.48 date: 2008/07/09 17:17:32; author: noec; state: Exp; lines: +2 -2 Fixed src.jar references ---------------------------- revision 1.47 date: 2008/07/03 13:13:14; author: noec; state: Exp; lines: +1 -1 Fixed common-src.jar reference. =============================================================================
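
    One way to make the log grep-friendly is to post-process it into one line per revision. A rough Python sketch, assuming the block format shown above and reading the output of cvs log on stdin:

      import sys

      working_file, revision, comment = None, None, []

      def flush():
          # Print one tab-separated line per revision: file, version, comment text.
          if working_file and revision:
              print("%s\t%s\t%s" % (working_file, revision, " ".join(comment)))

      for line in sys.stdin:
          line = line.rstrip("\n")
          if line.startswith("Working file:"):
              working_file = line.split(":", 1)[1].strip()
          elif line.startswith("revision "):
              flush()
              revision, comment = line.split()[1], []
          elif line.startswith(("----", "====")):
              flush()
              revision, comment = None, []
          elif revision and not line.startswith("date:"):
              comment.append(line.strip())
      flush()

    The result can then be piped straight into grep, and each match still carries the file path and version.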

    Read the article

  • NHibernate is not connecting to SQL Server

    - by user177883
    When I set up a regular connection it works; however, when I try to use NHibernate with hibernate.cfg.xml, I get the following error. Message="A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: SQL Network Interfaces, error: 26 - Error Locating Server/Instance Specified)" Source=".Net SqlClient Data Provider" What could be the reason for this, and how can I resolve it? I doubt that it's a network or SQL Server configuration error. <?xml version="1.0" ?> <hibernate-configuration xmlns="urn:nhibernate-configuration-2.2" > <session-factory> <property name="connection.provider">NHibernate.Connection.DriverConnectionProvider</property> <property name="dialect">NHibernate.Dialect.MsSql2005Dialect</property> <property name="connection.driver_class">NHibernate.Driver.SqlClientDriver</property> <property name="connection.connection_string">Server=(ServerName\DEV_ENV);Initial Catalog=dbName;User Id=SA;Password=PASS</property> <property name="proxyfactory.factory_class">NHibernate.ByteCode.LinFu.ProxyFactoryFactory, NHibernate.ByteCode.LinFu</property> </session-factory> </hibernate-configuration>
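
    One detail worth checking, purely as an assumption since the real server name is masked, is the parentheses around the server name in the connection string; the SqlClientDriver expects the same plain Server=machine\instance form that works for a regular ADO.NET connection, e.g.:

      <property name="connection.connection_string">
        Server=ServerName\DEV_ENV;Initial Catalog=dbName;User Id=SA;Password=PASS
      </property>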

    Read the article

  • Hadoop reduce task hangs

    - by user806098
    I set up a Hadoop cluster with 4 nodes. When running a map-reduce job, the map tasks finish quickly, while the reduce task hangs at 27%. I checked the log; the reduce task fails to fetch map output from the map nodes. The job tracker log on the master shows messages like this: 2011-06-27 19:55:14,748 INFO org.apache.hadoop.mapred.JobTracker: Adding task (REDUCE) 'attempt_201106271953_0001_r_000000_0' to tip task_201106271953_0001_r_000000, for tracker 'tracker_web30.bbn.com.cn:localhost/127.0.0.1:56476' And the name node log on the master shows messages like this: 2011-06-27 14:00:52,898 INFO org.apache.hadoop.ipc.Server: IPC Server handler 4 on 54310, call register(DatanodeRegistration(202.106.199.39:50010, storageID=DS-1989397900-202.106.199.39-50010-1308723051262, infoPort=50075, ipcPort=50020)) from 192.168.225.19:16129: error: java.io.IOException: verifyNodeRegistration: unknown datanode 202.106.199.39:50010 However, neither "web30.bbn.com.cn" nor 202.106.199.39/202.106.199.3 is a slave node. I think such IPs/domains appear because Hadoop fails to resolve a node (first against the intranet DNS server), then goes to a higher-level DNS server, and later to the top, still failing, so the "junk" IPs/domains are returned. But I checked my config, and it goes like this: /etc/hosts: 127.0.0.1 localhost.localdomain localhost ::1 localhost6.localdomain6 localhost6 192.168.225.16 master 192.168.225.66 slave1 192.168.225.20 slave5 192.168.225.17 slave17 conf/core-site.xml: hadoop.tmp.dir /root/hadoop_tmp/hadoop_${user.name} fs.default.name hdfs://master:54310 io.sort.mb 1024 hdfs-site.xml: dfs.replication 3 masters: master slaves: master slave1 slave5 slave17 Also, all firewalls (iptables) are turned off, and ssh between each pair of nodes is OK, so I don't know where exactly the error comes from. Please help. Thanks a lot.

    Read the article

  • RichFaces 3.3.2 is not working in WebSphere 7

    - by palakolanusrinu
    Hi, I'm new to RichFaces and JSF. I developed one sample app and it works fine in Tomcat 6; now I have moved the same app to WebSphere 7 and I'm facing a problem. Exception on console: Unable to locate the tag library for uri "http://richfaces.org/rich" List of jars in the build path: commons-beanutils-1.8.3.jar commons-codec-1.4.jar commons-collections-3.2.1.jar commons-digester-2.0.jar commons-discovery-0.4.jar commons-logging.jar jsf-api.jar jsf-impl.jar jstl-1.2.jar jstl-api-1.2.jar jstl-impl-1.2.jar richfaces-api-3.3.2.SR1.jar richfaces-impl-3.3.2.SR1.jar richfaces-ui-3.3.2.SR1.jar Web.xml is http://java.sun.com/xml/ns/javaee/web-app_2_5.xsd" id="WebApp_ID" version="2.5" bbh index.html index.htm index.jsp default.html default.htm default.jsp Faces Servlet javax.faces.webapp.FacesServlet 1 Faces Servlet *.jsf State saving method: 'client' or 'server' (=default). See JSF Specification 2.5.2 javax.faces.STATE_SAVING_METHOD client javax.servlet.jsp.jstl.fmt.localizationContext resources.application com.sun.faces.config.ConfigureListener Faces Servlet *.faces <context-param> <param-name>org.richfaces.SKIN</param-name> <param-value>blueSky</param-value> </context-param> <filter> <display-name>RichFaces Filter</display-name> <filter-name>richfaces</filter-name> <filter-class>org.ajax4jsf.Filter</filter-class> <init-param> <param-name>createTempFiles</param-name> true maxRequestSize 2000000 richfaces Faces Servlet REQUEST FORWARD INCLUDE My JSP includes the tag libs like this: <%@taglib uri="http://java.sun.com/jsf/core" prefix="f"%> <%@taglib uri="http://java.sun.com/jsf/html" prefix="h"%> <%@taglib uri="http://java.sun.com/jsp/jstl/core" prefix="c"%> <%@ taglib uri="http://richfaces.org/a4j" prefix="a4j"%> <%@ taglib uri="http://richfaces.org/rich" prefix="rich"%> Please help me, and thanks in advance.

    Read the article

  • What is the technological basis for Azure DocumentDB?

    - by user1703840
    Microsoft announced the availability of Azure DocumentDB as follows... A fully-managed, highly-scalable, NoSQL document database service. - Rich query over a schema-free JSON data model - Transactional execution of JavaScript logic - Scalable storage and throughput - Tunable consistency - Rapid development with familiar technologies - Blazingly fast and write optimized database service I really like the "transactional execution of JavaScript logic". Sounds like an approach similar to that of PostgreSQL NoSQL. Anyone know what is the technological basis for the Azure DocumentDB service? SQL Server?

    Read the article

  • Keep Hibernate Initializer from Crashing Program

    - by manyxcxi
    I have a Java program using a basic Hibernate session factory. I had an issue with a Hibernate hbm.xml mapping file, and it crashed my program even though I had the getSessionFactory() call in a try/catch: try { session = SessionFactoryUtil.getSessionFactory().openStatelessSession(); session.beginTransaction(); rh = getRunHistoryEntry(session); if(rh == null) { throw new Exception("No run history information found in the database for run id " + runId_ + "!"); } } catch(Exception ex) { logger.error("Error initializing hibernate"); } It still manages to break out of this try/catch and crash the main thread. How do I keep it from doing this? The main issue is that I have a bunch of cleanup commands that NEED to run before the main thread shuts down, and I need to be able to guarantee that even after a failure it still cleans up and goes down somewhat gracefully. The session factory looks like this: public class SessionFactoryUtil { private static final SessionFactory sessionFactory; static { try { // Create the SessionFactory from hibernate.cfg.xml sessionFactory = new Configuration().configure().buildSessionFactory(); } catch (Throwable ex) { // Make sure you log the exception, as it might be swallowed System.err.println("Initial SessionFactory creation failed." + ex); throw new ExceptionInInitializerError(ex); } } public static SessionFactory getSessionFactory() { try { return sessionFactory; } catch(Exception ex) { return null; } } }
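
    For what it's worth, the static initializer shown above rethrows ExceptionInInitializerError, which is an Error rather than an Exception, so a catch (Exception ex) block never sees it. A minimal sketch of one way to guard the call and still run the cleanup, assuming the SessionFactoryUtil above:

      StatelessSession session = null;
      try {
          session = SessionFactoryUtil.getSessionFactory().openStatelessSession();
          session.beginTransaction();
          // ... load the run history entry as before ...
      } catch (Throwable t) {          // Throwable also covers ExceptionInInitializerError
          logger.error("Error initializing Hibernate", t);
      } finally {
          runCleanup();                // hypothetical method holding the mandatory cleanup
      }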

    Read the article

  • Windows Azure Table Storage LINQ Operators

    - by Ryan Elkins
    Currently Table Storage supports From, Where, Take, and First. Are there plans to support any of the other 29 operators? If we have to code these ourselves, how much of a performance difference are we looking at compared to doing something similar in SQL Server? Do you see it being somewhat comparable, or will it be far, far slower if I need to do a Count, Sum, or Group By over a gigantic dataset? I like the Azure platform and the idea of cloud-based storage. I like Windows Azure for the amount of data it can store and the schema-less nature of table storage. SQL Azure just won't work due to the high cost of storage space.

    Read the article

  • Recommendations for a high-volume log event viewer in a Java environment

    - by Thorbjørn Ravn Andersen
    I am in a situation where I would like to accept a LOT of log events controlled by me - notably from the logging agent I am preparing for slf4j - and then analyze them interactively. I am not so much interested in a facility that presents formatted log files, but in one that can accept log events as objects and allow me to sort and display them, e.g. by thread and timeline. Chainsaw could maybe be an option, but it is currently not compatible with logback, which I use for technical reasons. Is there any project, with stand-alone viewers or embedded in an IDE, which would be suitable for this kind of log handling? I am aware that I am approaching what might be suitable for a profiler, so if there is a profiler project suitable for this kind of data acquisition and display where I can feed the event pipe, I would like to hear about it. Thanks for all feedback. Update 2009-03-19: I have found that there is not a log viewer which allows me to see what I would like (a visual display of events with coordinates determined by day and time, etc.), so I have decided to create a very terse XML format derived from the log4j XMLLayout, adapted to be as readable as possible while still being valid XML snippets, and then use the Microsoft LogParser to extract the information I need for post-processing in other tools.

    Read the article

  • Hibernate bi-directional many-to-many mapping advice

    - by Rob
    Hi all, I wondered if anyone might be able to help me out. I am trying to work out what to google for (or any other ideas!!). Basically I have a bidirectional many-to-many mapping between a user entity and a club entity (via a join table called userClubs). I now want to include a column in userClubs that represents the role, so that when I call user.getClubs() I can also work out what level of access they have. Is there a clever way to do this using Hibernate, or do I need to rethink the database structure? Thank you for any help (or just for reading this far!!) The user.hbm.xml looks a bit like <set name="clubs" table="userClubs" cascade="save-update"> <key column="user_ID"/> <many-to-many column="activity_ID" class="com.ActivityGB.client.domain.Activity"/> </set> and the activity.hbm.xml part is <set name="members" inverse="true" table="userClubs" cascade="save-update"> <key column="activity_ID"/> <many-to-many column="user_ID" class="com.ActivityGB.client.domain.User"/> </set> The current userClubs table contains the fields id | user_ID | activity_ID I would like it to contain id | user_ID | activity_ID | role and be able to access the role on both sides...
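
    A common way to carry an extra column on the join table is to stop mapping it as a pure many-to-many and map the userClubs row itself, either as its own entity or as a composite element. A rough sketch of the composite-element variant for user.hbm.xml, assuming a hypothetical Membership value class with activity and role properties:

      <set name="clubs" table="userClubs" cascade="save-update">
        <key column="user_ID"/>
        <composite-element class="com.ActivityGB.client.domain.Membership">
          <many-to-one name="activity" column="activity_ID"
                       class="com.ActivityGB.client.domain.Activity"/>
          <property name="role" column="role"/>
        </composite-element>
      </set>

    user.getClubs() would then return Membership objects exposing both the club and the role; the same idea applies on the inverse side.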

    Read the article

  • Specifying both multiple targets and multiple build files with ant or subant in 1.6

    - by Paul Marshall
    I'm trying to unify a build process, running one build to get multiple packages. My first shot at this is just having a central build script call <ant or <subant on each project's build.xml file. I'm using Ant 1.6, and I've run into a funny problem: either I use the <ant task, and I can specify multiple targets but not multiple build files, or I use the <subant task, and I can specify multiple build files but not multiple targets. I realize there are a few solutions here already: Just upgrade to Ant 1.7 already; <antcall can do multiple targets there. Edit the separate project build files to have a variety of top-level targets, so I can call each individual file with just one target, and use <antcall. Copy-paste a lot of <ant tasks, with a little help from <macrodef to preserve sanity. Is there something I've missed that will allow me to do what I want from this single central build.xml without a) editing individual project files, b) writing lots of repetitive code, or c) upgrading Ant, and that d) doesn't require editing every time I add a new project?
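
    If none of the listed options appeal, one middle ground, sketched here with assumed target and directory names since the real layout isn't shown, is a <macrodef> in the central build.xml that runs a fixed set of targets against whichever project directory it is handed, so only the list of projects needs maintaining:

      <macrodef name="build-project">
        <attribute name="dir"/>
        <sequential>
          <!-- one <ant> call per target; Ant 1.6's <ant> takes a single target attribute -->
          <ant dir="@{dir}" target="clean" inheritAll="false"/>
          <ant dir="@{dir}" target="dist"  inheritAll="false"/>
        </sequential>
      </macrodef>

      <target name="build-all">
        <build-project dir="projectA"/>
        <build-project dir="projectB"/>
      </target>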

    Read the article

  • innerHTML doesn't work correctly with XHTML in Chrome

    - by Desperadeus
    Hi! I'm having trouble with Chrome 5.0.375.70, while FF 3.6.3 and Opera 10.53 are OK. Below is the line of code: document.getElementById('content').innerHTML = data.documentElement.innerHTML; The data object in the code is a document (typeof(data) == 'object') and I got it via an AJAX request for chapter01.xhtml: <?xml version="1.0" encoding="utf-8"?> <!DOCTYPE html [ <!ENTITY D "&#x2014;"> <!ENTITY o "&#x2018;"> <!ENTITY c "&#x2019;"> <!ENTITY O "&#x201C;"> <!ENTITY C "&#x201D;"> ]> <html xmlns="http://www.w3.org/1999/xhtml"> <head> <title>Alice's Adventures in Wonderland by Lewis Carroll. Chapter I: Down the Rabbit-Hole</title> <link rel="stylesheet" type="text/css" href="style.css"/> <link rel="stylesheet" type="application/vnd.adobe-page-template+xml" href="page-template.xpgt"/> </head> <body> <div class="title_box"> <h2 class="chapnum">Chapter I</h2> <h2 class="chaptitle">Down the Rabbit-Hole</h2> <hr/> </div> Chrome cuts off everything before the body, and as a result the stylesheet link in the head is lost; the user can't see formatted text and images. How can I fix or work around this?
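
    One possible workaround, a sketch only and assuming data really is a parsed XML Document as described, is to skip innerHTML and move the nodes over through the DOM API, so the stylesheet links from the fetched head can be re-attached explicitly:

      // Copy the stylesheet links from the fetched document into the current page's head.
      var links = data.getElementsByTagName('link');
      for (var i = 0; i < links.length; i++) {
          document.getElementsByTagName('head')[0].appendChild(document.importNode(links[i], true));
      }

      // Replace the placeholder's content with the fetched body, node by node.
      var target = document.getElementById('content');
      target.innerHTML = '';
      var body = data.getElementsByTagName('body')[0];
      for (var j = 0; j < body.childNodes.length; j++) {
          target.appendChild(document.importNode(body.childNodes[j], true));
      }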

    Read the article

  • mvn deploy to AWS (ssh via distributionManagement)

    - by Dexter
    I am working on deploying a WAR file to AWS using Maven. I am planning to use 'mvn deploy' for this, which would copy the WAR file to AWS over SSH. I am following http://maven.apache.org/plugins/maven-deploy-plugin/examples/deploy-ssh-external.html. This is my POM file: <project> ... <distributionManagement> <repository> <id>ssh-aws</id> <url>scpexe://<ec2 instance>.compute-1.amazonaws.com</url> </repository> </distributionManagement> <build> <extensions> <!-- Enabling the use of FTP --> <extension> <groupId>org.apache.maven.wagon</groupId> <artifactId>wagon-ssh-external</artifactId> <version>1.0-beta-6</version> </extension> </extensions> </build> .. </project> This is my settings.xml: <server> <id>ssh-aws</id> <username>aws-user</username> </server> The only issue is that I am unable to figure out the URL in the distributionManagement node of pom.xml. I am able to SSH into the AWS server with the following: ssh -i ~/pemfile/pemfile-key.pem aws-user@<ec2 instance>.compute-1.amazonaws.com But when I run mvn clean deploy, I receive this: Exit code: 1 - Permission denied (publickey). -> [Help 1] Thanks in advance.
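
    Since the interactive login only works with the .pem identity file, one guess is that the Wagon transport also needs to be pointed at that key. Maven's settings.xml <server> entry accepts a privateKey element, so a sketch (path assumed from the ssh command above) would be:

      <server>
        <id>ssh-aws</id>
        <username>aws-user</username>
        <privateKey>${user.home}/pemfile/pemfile-key.pem</privateKey>
        <!-- <passphrase>...</passphrase> only if the key itself is protected -->
      </server>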

    Read the article

  • Java logging nightmare: log4j not behaving as expected with Spring + Tomcat 6

    - by maverick
    I have a Spring application with log4j configured via XML, running on Tomcat 6, that was working fine until we added a bunch of dependencies via Maven. At some point the whole application just started logging only part of what was declared in the log4j.xml. (A small rant here: why does logging have to be this hard in the Java world? Why does an application that was just fine suddenly start behaving so weirdly, and why is it so freaking hard to debug?) I've been reading and trying to solve this issue for days, but so far no luck; hopefully some expert here can give me some insights. I've added the log4j debug option to check whether log4j is reading the config file and its values, and this is part of what it shows: log4j: Level value for org.springframework.web is [debug]. log4j: org.springframework.web level set to DEBUG log4j: Retreiving an instance of org.apache.log4j.Logger. log4j: Setting [org.compass] additivity to [true]. log4j: Level value for org.compass is [debug]. log4j: org.compass level set to DEBUG As you can see, debug is enabled for compass and spring.web, but it only shows "INFO" level for both packages. My log4j config file has nothing out of the ordinary, just a plain ConsoleAppender: <log4j:configuration xmlns:log4j="http://jakarta.apache.org/log4j/"> <!-- Appenders --> <appender name="console" class="org.apache.log4j.ConsoleAppender"> <param name="Target" value="System.out" /> <layout class="org.apache.log4j.PatternLayout"> <param name="ConversionPattern" value="%-5p: %c - %m%n" /> </layout> </appender> What's the trick to make this work? What am I misunderstanding here? Can someone point me in the right direction and explain how I can make this logging mess more bulletproof?

    Read the article

  • Getting an element's children with a certain tag in jQuery

    - by johnnyArt
    I'm trying to get all the input elements of a certain form in jQuery, providing only the id of the form and knowing only that the wanted fields are input elements. Let's say: <form action='#' id='formId'> <input id='name' /> <input id='surname'/> </form> How do I access them individually with jQuery? I tried something like $('#formId > input') with no success; in fact an error came back on the console: "XML filter is applied to non-XML value (function (a, b) {return new (c.fn.init)(a, b);})" Maybe I have to do it with .children or something like that? I'm pretty new at jQuery and I'm not really liking the docs. It was much friendlier in MooTools, or maybe I just need to get used to it. Oh, and last but not least (I've seen it asked before but with no final answer): can I create a new DOM element with jQuery and work with it before inserting it (if I ever do) into the code? In MooTools we had something like var myEl = new Element(element[, properties]); and you could then refer to it in further expressions, but I fail to understand how to do that in jQuery. Thanks in advance.
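
    For reference, a small sketch of both parts of the question, with the selector taken from the markup above and everything else assumed:

      // All <input> descendants of the form, iterated individually.
      $('#formId input').each(function () {
          console.log(this.id + ' = ' + $(this).val());
      });

      // Or grab a single one directly by its id.
      var surname = $('#surname').val();

      // A detached element can be built and manipulated before it is ever inserted.
      var myEl = $('<div id="info">hello</div>').addClass('box');
      // ...and later, if desired:
      myEl.appendTo('#formId');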

    Read the article

  • using pom for test scope dependencies

    - by IttayD
    Hi, Is it possible to create a pom file so that it can be used inside another pom to add test-scope dependencies? So in module E's pom.xml I have: <dependencies> <dependency> <groupId>com.example</groupId> <artifactId>D</artifactId> <type>pom</type> <scope>test</scope> </dependency> </dependencies> So that if D's pom.xml contains dependencies on artifacts A, B, C, then these artifacts are in the compilation and execution classpath of E's tests. NOTE: the reason I want such a pom, and do not want to rely on regular dependency resolution, is that I have created a tests jar using maven-jar-plugin:test-jar, and using that jar as a dependency causes Maven to not use its transitive dependencies (see http://jira.codehaus.org/browse/MNG-1378) UPDATE: this does not work for me (maybe because I'm trying to use it for the test scope): http://www.sonatype.com/books/mvnref-book/reference/pom-relationships-sect-pom-best-practice.html

    Read the article

  • accessing a value of a nested hash

    - by st
    Hello! I am new to Perl and I have a problem that's very simple, but I cannot find the answer when consulting my Perl book. When printing the result of Dumper($request); I get the following result: $VAR1 = bless( { '_protocol' => 'HTTP/1.1', '_content' => '', '_uri' => bless( do{\(my $o = 'http://myawesomeserver.org:8081/counter/')}, 'URI::http' ), '_headers' => bless( { 'user-agent' => 'Mozilla/5.0 (X11; U; Linux i686; en; rv:1.9.0.4) Gecko/20080528 Epiphany/2.22 Firefox/3.0', 'connection' => 'keep-alive', 'cache-control' => 'max-age=0', 'keep-alive' => '300', 'accept' => 'text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8', 'accept-language' => 'en-us,en;q=0.5', 'accept-encoding' => 'gzip,deflate', 'host' => 'localhost:8081', 'accept-charset' => 'ISO-8859-1,utf-8;q=0.7,*;q=0.7' }, 'HTTP::Headers' ), '_method' => 'GET', '_handle' => bless( \*Symbol::GEN0, 'FileHandle' ) }, 'HTTP::Server::Simple::Dispatched::Request' ); How can I access the values of '_method' ('GET') or of 'host' ('localhost:8081')? I know that's an easy question, but Perl is somewhat cryptic at the beginning. Thank you, St.
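
    Since $request is a blessed hash reference, the fields can be reached with the arrow/dereference syntax; a short sketch, with the keys copied from the dump above:

      my $method = $request->{_method};           # 'GET'
      my $host   = $request->{_headers}{host};    # 'localhost:8081'

      # The object wraps HTTP::Headers, so the usual accessor methods
      # (assuming an HTTP::Request-style API) are often the cleaner route:
      my $method2 = $request->method;
      my $host2   = $request->header('Host');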

    Read the article

  • SQL latest/top items in category

    - by drozzy
    What is a scalable way to select the latest 10 items from each category? I have a schema like this: item category updated so I want to select the 10 last-updated items from each category. The current solution I can come up with is to query for the categories first and then issue some sort of union query: query = none for cat in categories: query += select top 10 from table where category=cat order by updated I am not sure how efficient this will be for bigger databases (1 million rows). If there is a way to do this in one go, that would be nice. Any help appreciated.
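
    If the database supports window functions (SQL Server 2005+, PostgreSQL 8.4+, and so on; the engine isn't named, so this is an assumption), it can be done in one statement with ROW_NUMBER(). Column names are taken from the schema above and the table name is invented:

      SELECT item, category, updated
      FROM (
          SELECT item, category, updated,
                 ROW_NUMBER() OVER (PARTITION BY category ORDER BY updated DESC) AS rn
          FROM items
      ) ranked
      WHERE rn <= 10;

    Each category's rows are numbered from newest to oldest, so keeping rn <= 10 returns the latest 10 per category without a per-category loop.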

    Read the article

  • Looping and recursion based on parameter passed

    - by contactkx
    I have an XML document organized like below: <section name="Parent 1 Text here" ID="1" > <section name="Child 1 Text here" ID="11"> </section> <section name="Child 2 Text here" ID="12"> <section name="GrandChild Text here" ID="121" > </section> </section> </section> <section name="Parent 2 Text here" ID="2" > <section name="Child 1 Text here" ID="22"> </section> <section name="Child 2 Text here" ID="23"> <section name="GrandChild Text here" ID="232" > </section> </section> </section> I have to produce the output XML below: <section name="Parent 1 Text here" ID="1" > <section name="Child 2 Text here" ID="12"> <section name="GrandChild Text here" ID="121" > </section> </section> </section> <section name="Parent 2 Text here" ID="2" > <section name="Child 2 Text here" ID="23"> </section> </section> I have to achieve the above using an XSLT 1.0 transformation. I was planning to pass a comma-separated string as a parameter with value = "1,12,121,2,23" My question: how do I loop over the comma-separated parameter in XSLT 1.0? Is there a simpler way to achieve the above? Please remember I have to do this in XSLT 1.0. Your help is appreciated.
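
    Rather than looping, one XSLT 1.0 idiom is to test membership with contains() against a delimiter-wrapped copy of the list, so each section keeps itself only if its ID is in the parameter; a rough sketch (the parameter name is assumed):

      <xsl:param name="ids" select="'1,12,121,2,23'"/>

      <!-- Copy a <section> only if its ID appears in the comma-separated list. -->
      <xsl:template match="section">
        <xsl:if test="contains(concat(',', $ids, ','), concat(',', @ID, ','))">
          <xsl:copy>
            <xsl:copy-of select="@*"/>
            <xsl:apply-templates select="section"/>
          </xsl:copy>
        </xsl:if>
      </xsl:template>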

    Read the article

  • Importing Sqlite data into Google App Engine

    - by Keck
    I have a relatively extensive SQLite database that I'd like to import into my Google App Engine Python app. I've created my models using the App Engine API; they are close, but not quite identical, to the existing schema. I've written an import script to load the data from SQLite and create/save new App Engine objects, but the App Engine environment blocks me from accessing the sqlite library. This script is only to be run on my local App Engine instance, and from there I hope to push the data to Google. Am I approaching this problem the wrong way, or is there a way to import the sqlite library while running in the local instance's environment?
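
    One common workaround, sketched here with an invented table and field list, is to do the SQLite reading in a plain Python script outside the dev server (where the sqlite3 module is available) and hand App Engine a neutral format such as CSV, which the SDK's bulkloader can then upload:

      import csv
      import sqlite3

      # Runs outside the App Engine sandbox, so sqlite3 is importable.
      conn = sqlite3.connect('legacy.db')                              # path assumed
      rows = conn.execute('SELECT id, name, created FROM my_table')    # table/columns assumed

      with open('my_table.csv', 'w') as out:
          writer = csv.writer(out)
          for row in rows:
              writer.writerow(row)
      conn.close()

      # The CSV can then be pushed to the datastore with the bulkloader, e.g.:
      #   appcfg.py upload_data --filename=my_table.csv --kind=MyModel <app-directory>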

    Read the article

  • Locked stored procedures in SQL Server

    - by Greg
    Hi, I am not too familiar with SQL Server 2005. I have a schema which has stored procedures with a small lock icon on them. As I understand it, they were created using C#; all these locked procedures have a C# source file with the code of the procedures. The thing is, I can't access them. I need to modify one of these procedures, but it doesn't let me modify them. I have the source code (from Visual Studio) for these procedures, but when I change something in the code, it doesn't affect the procedures in SQL Server. How can I change the path to the assembly in SQL Server 2005, or is there any other way I can access these stored procedures? Thanks in advance, Greg
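
    Procedures backed by C# source like this are SQLCLR procedures: they live in a .NET assembly registered in the database, so editing the C# source has no effect until the rebuilt DLL is loaded back into SQL Server. A hedged T-SQL sketch (assembly name and path are assumed):

      -- Point the database at the freshly built DLL that backs the locked procedures.
      ALTER ASSEMBLY [MyProcedures]
      FROM 'C:\builds\MyProcedures.dll';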

    Read the article

  • how to get individual(s) for given class/individual with given object property using SPARQL

    - by udayalkonline
    Hi, I have a simple ontology called "campus.owl". There is a class called "Lecturer" which has two subclasses, RegularLecturer and VisitingLecturer. There is another class called "Student", which is a sibling of the Lecturer class. I have created individuals for all the classes. The Student class is joined to the Lecturer class with a "has" object property. Problem: I want to get some Lecturer/VisitingLecturer individuals for a given student individual. Could you please help me get this result! Thanks in advance! PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#> PREFIX rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#> PREFIX my: <http://www.semanticweb.org/ontologies/2010/5/Ontology1275975684120.owl#> SELECT ?lec WHERE { ?lec..........??? } Any idea..?? Thanks in advance!
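
    A sketch of one possible query, assuming the student individual is my:Student1 and that the "has" property points from the student to the lecturer, as described above:

      PREFIX rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#>
      PREFIX my:  <http://www.semanticweb.org/ontologies/2010/5/Ontology1275975684120.owl#>

      SELECT ?lec
      WHERE {
        my:Student1 my:has ?lec .           # Student1 is an assumed individual name
        ?lec rdf:type my:VisitingLecturer . # or my:Lecturer, with reasoning enabled for the subclasses
      }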

    Read the article

  • Query a column and everything subordinate to it (hard to describe, non-native speaker, please let me explain)

    - by MAD9
    A few weeks ago, I asked a question about how to generate hierarchical XML from a table that has a parentID column. It all works fine. The point is that, according to the hierarchy, I also want to query a table. I'll give you an example. That's the table with the codes: ID CODE NAME PARENTID 1 ROOT IndustryCode NULL 2 IND Industry 1 3 CON Consulting 1 4 FIN Finance 1 5 PHARM Pharmaceuticals 2 6 AUTO Automotive 2 7 STRAT Strategy 3 8 IMPL Implementation 3 9 CFIN Corporate Finance 4 10 CMRKT Capital Markets 9 From which I generate (for displaying in a TreeView control) this XML: <record key="1" parentkey="" Code="ROOT" Name="IndustryCode"> <record key="2" parentkey="1" Code="IND" Name="Industry"> <record key="5" parentkey="2" Code="PHARM" Name="Pharmaceuticals" /> <record key="6" parentkey="2" Code="AUTO" Name="Automotive" /> </record> <record key="3" parentkey="1" Code="CON" Name="Consulting"> <record key="7" parentkey="3" Code="STRAT" Name="Strategy" /> <record key="8" parentkey="3" Code="IMPL" Name="Implementation" /> </record> <record key="4" parentkey="1" Code="FIN" Name="Finance"> <record key="9" parentkey="4" Code="CFIN" Name="Corporate Finance"> <record key="10" parentkey="9" Code="CMRKT" Name="Capital Markets" /> </record> </record> </record> As you can see, some codes are subordinate to others, for example AUTO << IND << ROOT. What I want (and have absolutely no idea how to realise, or even where to start) is to be able to query another table (where one column is this certain code, of course) for a code and get all records with the specific code and all subordinate codes. For example: I query the other table for "IndustryCode = IND[ustry]" and get (of course) the records containing "IND", but also AUTO[motive] and PHARM[aceutical] (= all subordinates). It's SQL Server 2008 Express with Advanced Services.
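
    SQL Server 2008 supports recursive common table expressions, which fit this "code plus all subordinates" lookup directly. A sketch using the codes table above, where the codes table is assumed to be named Codes and the detail table Records:

      WITH SubCodes AS (
          -- anchor: the code asked for
          SELECT ID, CODE
          FROM Codes
          WHERE CODE = 'IND'
          UNION ALL
          -- recursive step: every code whose parent is already in the result
          SELECT c.ID, c.CODE
          FROM Codes c
          JOIN SubCodes s ON c.PARENTID = s.ID
      )
      SELECT r.*
      FROM Records r
      JOIN SubCodes s ON r.IndustryCode = s.CODE;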

    Read the article

  • Blackberry application works in simulator but not device

    - by Kai
    I read some of the similar posts on this site that deal with what seems to be the same issue, and the responses didn't really clarify things for me. My application works fine in the simulator. I believe I'm on a Bold 9000 with OS 4.6. The app is signed. My app makes an HTTP call via 3G to fetch an XML result; the type is application/xhtml+xml. On the device it gives no error and shows no visual sign of error. I tell the try/catch to print the results to the screen and I get nothing. HttpConnection was taken right out of the demos and works fine in the sim. Since it gives no error, I began to reflect on things I recall reading back when the project began. deviceside=true? Something like that? My request is simply HttpConnection connection = (HttpConnection)Connector.open(url); where url is just a standard URL, no GET vars. Based on the amount of time I see the connection arrows in the corner of the screen, I assume the app is launching the initial communication to my server, then either getting a bad result, or it gets results and the persistent store is not functioning as expected. I have no idea where to begin with this. Posting code would be ridiculous since it would be basically my whole app. I guess my question is whether anyone knows of any major differences between device and simulator that could cause something like the HTTP connection or the persistent store to fail? A build setting? An OS restriction? Any standard procedure I may have just not known about that everyone should do before beginning device testing? Thanks
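
    For what it's worth, the deviceside suffix half-remembered above is appended to the URL at Connector.open() time on BlackBerry; real devices usually need an explicit transport suffix while the simulator does not. A sketch (which transport applies depends on how the device is provisioned, so this is an assumption):

      // Direct TCP over the carrier network; carrier APN settings may also be required.
      HttpConnection connection =
              (HttpConnection) Connector.open(url + ";deviceside=true");

      // Other common variants:
      //   url + ";deviceside=false"               -> BES/MDS transport
      //   url + ";deviceside=true;interface=wifi" -> Wi-Fi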

    Read the article

  • Many-To-Many dimensional model

    - by Mevdiven
    Folks, I have a dimension table called DIM_FILE which holds information about the files we receive from customers. Each file has detail records, which constitute my fact table, CUST_DETAIL. In the main process, a file goes through several stages, and each stage tags a status onto it. Long story short, I have a many-to-many relationship. Any ideas around star-schema dimensional modeling? A customer record belongs to only a single file, and a file can have multiple statuses. FACT ---- CustID FileID AmountDue DIM_FILE -------- FileID FileName DateReceived FILE_STATUS ----------- FileID StatusDateTime StatusCode

    Read the article
