Search Results

Search found 765 results on 31 pages for 'mr yoshida'.

Page 12/31 | < Previous Page | 8 9 10 11 12 13 14 15 16 17 18 19  | Next Page >

  • links for 2011-03-08

    - by Bob Rhubart
    - The Empowered Business: "Someone needs to be the enterprise parent that asks the question, “do you really need that?” It may be a shiny new thing, but does it make a difference in the ability to accomplish the strategy and goals?" - Enterprise Architect Todd Biske (tags: enterprisearchitecture)
    - Knowledge Workers in the British Raj: "While we’ve used technology to change business, business has also evolved to the point that it’s changing how we think about and use technology." - Peter Evans Greenwood (tags: enterprisearchitecture enterprise2.0)
    - Arun Gupta, Miles to go ...: OTN Developer Day Boston 2011 - Slides & Trip Report: Arun Gupta shares slides from his Developer Day presentations. (tags: oracle otn java)
    - Use WLST to Delete All JMS Messages From a Destination (James Bayer's Blog): James Bayer responds to a question. (tags: oracle otn weblogic jms)
    - Triangle Circle Square: Apex in the Amazon Cloud: Scott Wesley shares several links to resources covering Oracle Apex on an Amazon EC2 instance. (tags: oracle apex ec2 amazon cloud)
    - William Vambenepe: Reading IBM's proposed standard for Cloud Architecture: The always entertaining William Vambenepe gives IBM's proposed Cloud standards the full Ebert. (tags: oracle cloud ibm standards)
    - Government Information Group Cloud Computing Research Study: "The twin pressures of reduced budgets and the need for greater efficiency have led the federal government to strongly promote cloud computing as a solution whenever possible." (tags: cloudcomputing cloud)
    - The Ron Batra Blog: Technology Whispers: Top 10 Reasons to go ExaData: "Continuing my exploration of ExaData, I thought I'd take a minute to consolidate my thoughts into key reasons for which Oracle ExaData could be a good fit for your needs." - Oracle ACE Director Ron Batra (tags: oracle oracleace exadata)
    - Oracle WebCenter: Composite Applications & Mash-Ups (Oracle Enterprise 2.0 Blog): "The new Business Mash-up editor allows business users to take any Oracle Application or 3rd party application and wire the backend data sources or APIs to a rich set of visualizations and reuse them in mashups." (tags: oracle webcenter enterprise2.0)
    - Antonio Romero: Great Discussion of ETL and ELT Tooling in TDWI Linkedin Group: Antonio says: "There’s a great discussion of ETL and ELT tooling going on in the official TDWI Linkedin group, under the heading 'How Sustainable is SQL for ETL?' It delves into a wide range of topics." (tags: oracle linkedin etl elt)
    - YouTube - Bunny Inc. - Episode 1. Mr. CIO meets Mr. Executive Manager: Yes, it's a commercial. But it's well done and it's funny. (tags: e20 enterprise2.0 webcenter)
    - Markus Eisele: Both Weblogic and Glassfish are strategic products for Oracle: Oracle ACE Director Markus Eisele shares selected quotes pulled from the recent TechCast Live interview with Oracle's Anil Gaur and Adam Leftik. (tags: oracle java weblogic glassfish)
    - How to become an Oracle SOA expert? (SOA Partner Community Blog): Jürgen Kress shares info and links for those interested in capitalizing on SOA. (tags: oracle soa)

    Read the article

  • Industrial Manufacturing Industry Trends and Challenges

    Mr. Lorne Jones, Senior Director of Oracle's Global Manufacturing Strategy and Marketing practice, talks with Fred about the challenges industrial manufacturing companies face, and Oracle's expertise in supporting their Lean enterprise initiatives with Oracle enterprise solutions.

    Read the article

  • Review the New Migration Guide to SQL Server 2012 Always On

    - by KKline
    I had the pleasure of meeting Mr. Cephas Lin, of Microsoft, last year at the SQL Saturday in Indianapolis and then later at the PASS Summit in the fall. Cephas has been writing content for SQL Server 2012 Always On. Cephas has recently published his first whitepaper, a migration guide to SQL Server AlwaysOn. Read it and then pass along any feedback: HERE Enjoy, -Kev - Follow me on Twitter !...(read more)

    Read the article

  • MapRedux - PowerShell and Big Data

    - by Dittenhafer Solutions
    MapRedux – #PowerShell and #Big Data Have you been hearing about “big data”, “map reduce” and other large scale computing terms over the past couple of years and been curious to dig into more detail? Have you read some of the Apache Hadoop online documentation and unfortunately concluded that it wasn't feasible to set up a “test” Hadoop environment on your machine? More recently, I have read about some of Microsoft’s work to enable Hadoop on the Azure cloud. Being a "Microsoft"-leaning technologist, I am more inclined to be successful with experimentation when on the Windows platform. Of course, it is not that I am "religious" about one set of technologies over another, but rather more experienced. Anyway, within the past couple of weeks I have been thinking about PowerShell a bit more as the 2012 PowerShell Scripting Games approach and it occurred to me that PowerShell's support for Windows Remote Management (WinRM), and some other inherent features of PowerShell might lend themselves particularly well to a simple implementation of the MapReduce framework. I fired up my PowerShell ISE and started writing just to see where it would take me. Quite simply, the ScriptBlock feature combined with the ability of Invoke-Command to create remote jobs on networked servers provides much of the plumbing of a distributed computing environment. There are some limiting factors of course. Microsoft provided some default settings which prevent PowerShell from taking over a network without administrative approval first. But even with just one adjustment, a given Windows-based machine can become a node in a MapReduce-style distributed computing environment. Ok, so enough introduction. Let's talk about the code. First, any machine that will participate as a remote "node" will need WinRM enabled for remote access, as shown below. This is not exactly practical for hundreds of intended nodes, but for one (or five) machines in a test environment it does just fine. C:> winrm quickconfig WinRM is not set up to receive requests on this machine. The following changes must be made: Set the WinRM service type to auto start. Start the WinRM service. Make these changes [y/n]? y Alternatively, you could take the approach described in the Remotely enable PSRemoting post from the TechNet forum and use PowerShell to create remote scheduled tasks that will call Enable-PSRemoting on each intended node. Invoke-MapRedux Moving on, now that you have one or more remote "nodes" enabled, you can consider the actual Map and Reduce algorithms. Consider the following snippet: $MyMrResults = Invoke-MapRedux -MapReduceItem $Mr -ComputerName $MyNodes -DataSet $dataset -Verbose Invoke-MapRedux takes an instance of a MapReduceItem which references the Map and Reduce scriptblocks, an array of computer names which are the remote nodes, and the initial data set to be processed. As simple as that, you can start working with concepts of big data and the MapReduce paradigm. Now, how did we get there? I have published the initial version of my PsMapRedux PowerShell Module on GitHub. The PsMapRedux module provides the Invoke-MapRedux function described above. Feel free to browse the underlying code and even contribute to the project! In a later post, I plan to show some of the inner workings of the module, but for now let's move on to how the Map and Reduce functions are defined. Map Both the Map and Reduce functions need to follow a prescribed prototype. The prototype for a Map function in the MapRedux module is as follows.
A simple scriptblock that takes one PsObject parameter and returns a hashtable. It is important to note that the PsObject $dataset parameter is a MapRedux custom object that has a "Data" property which offers an array of data to be processed by the Map function. $aMap = { Param ( [PsObject] $dataset ) # Indicate the job is running on the remote node. Write-Host ($env:computername + "::Map"); # The hashtable to return $list = @{}; # ... Perform the mapping work and prepare the $list hashtable result with your custom PSObject... # ... The $dataset has a single 'Data' property which contains an array of data rows # which is a subset of the originally submitted data set. # Return the hashtable (Key, PSObject) Write-Output $list; } Reduce Likewise, with the Reduce function a simple prototype must be followed which takes a $key and a result $dataset from the MapRedux's partitioning function (which joins the Map results by key). Again, the $dataset is a MapRedux custom object that has a "Data" property as described in the Map section. $aReduce = { Param ( [object] $key, [PSObject] $dataset ) Write-Host ($env:computername + "::Reduce - Count: " + $dataset.Data.Count) # The hashtable to return $redux = @{}; # Return Write-Output $redux; } All Together Now When everything is put together in a short example script, you implement your Map and Reduce functions, query for some starting data, build the MapReduxItem via New-MapReduxItem and call Invoke-MapRedux to get the process started: # Import the MapRedux and SQL Server providers Import-Module "MapRedux" Import-Module "sqlps" -DisableNameChecking # Query the database for a dataset Set-Location SQLSERVER:\sql\dbserver1\default\databases\myDb $query = "SELECT MyKey, Date, Value1 FROM BigData ORDER BY MyKey"; Write-Host "Query: $query" $dataset = Invoke-SqlCmd -query $query # Build the Map function $MyMap = { Param ( [PsObject] $dataset ) Write-Host ($env:computername + "::Map"); $list = @{}; foreach($row in $dataset.Data) { # Write-Host ("Key: " + $row.MyKey.ToString()); if($list.ContainsKey($row.MyKey) -eq $true) { $s = $list.Item($row.MyKey); $s.Sum += $row.Value1; $s.Count++; } else { $s = New-Object PSObject; $s | Add-Member -Type NoteProperty -Name MyKey -Value $row.MyKey; $s | Add-Member -type NoteProperty -Name Sum -Value $row.Value1; $list.Add($row.MyKey, $s); } } Write-Output $list; } $MyReduce = { Param ( [object] $key, [PSObject] $dataset ) Write-Host ($env:computername + "::Reduce - Count: " + $dataset.Data.Count) $redux = @{}; $count = 0; foreach($s in $dataset.Data) { $sum += $s.Sum; $count += 1; } # Reduce $redux.Add($s.MyKey, $sum / $count); # Return Write-Output $redux; } # Create the item data $Mr = New-MapReduxItem "My Test MapReduce Job" $MyMap $MyReduce # Array of processing nodes... $MyNodes = ("node1", "node2", "node3", "node4", "localhost") # Run the Map Reduce routine... $MyMrResults = Invoke-MapRedux -MapReduceItem $Mr -ComputerName $MyNodes -DataSet $dataset -Verbose # Show the results Set-Location C:\ $MyMrResults | Out-GridView Conclusion I hope you have seen through this article that PowerShell has a significant infrastructure available for distributed computing. While it does take some code to expose a MapReduce-style framework, much of the work is already done and PowerShell could prove to be the easiest platform to develop and run big data jobs in your corporate data center, potentially in the Azure cloud, or certainly as an academic exercise at home or school.
Follow me on Twitter to stay up to date on the continuing progress of my PowerShell MapRedux module, and thanks for reading! Daniel

    Read the article

  • how to get rid of light desktop manager entry in ubuntu greeter

    - by user205162
    Got a "light desktop manager" entry in the Ubuntu greeter for some strange reason and can't get rid of it. As the current OS is Ubuntu 13.10, the way to configure the LightDM greeter is rather strange, and I can't find any solution in the greeter configs. More annoyingly, Mr. LDM seems to have a password that is very hard to guess, so I can't log in to this "account". Can you help me find the source of this problem and the solution?

    Read the article

  • Streaming Audio over UDP to Android

    - by Mr. Pig
    Is it possible to have Android (perhaps via MediaPlayer or a different existing class) accept media streams over UDP? I've successfully had MediaPlayer connect to an HTTP stream (as well as static files hosted on an HTTP server) but I'm wondering how one would go about accepting a stream from a UDP source. I've seen this and suppose a solution similar to that (where I download the stream via an independent UDP socket and then move the data to a MemoryBuffer that I then pass to MediaPlayer) is an option but I'm curious if a method already exists in the SDK, and if it does not, what other options do I have? Thanks
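    As far as the standard SDK of that era goes, MediaPlayer exposes no raw UDP data source, so the buffer-then-play workaround mentioned in the question is the usual fallback. The sketch below (plain Java; the port number and buffer threshold are illustrative assumptions) receives datagrams into a local temp file and then points MediaPlayer at that file; it is an outline of the idea rather than a drop-in solution.

        // Sketch only: pull a UDP stream into a local file, then hand it to MediaPlayer.
        import java.io.File;
        import java.io.FileOutputStream;
        import java.net.DatagramPacket;
        import java.net.DatagramSocket;
        import android.media.MediaPlayer;

        class UdpAudioBuffer {
            void bufferAndPlay(File tempFile) throws Exception {
                DatagramSocket socket = new DatagramSocket(5004);      // assumed port
                FileOutputStream out = new FileOutputStream(tempFile);
                byte[] buf = new byte[2048];
                long written = 0;
                while (written < 256 * 1024) {                         // fill an initial buffer first
                    DatagramPacket packet = new DatagramPacket(buf, buf.length);
                    socket.receive(packet);                            // blocking read of one datagram
                    out.write(packet.getData(), 0, packet.getLength());
                    written += packet.getLength();
                }
                out.flush();
                MediaPlayer player = new MediaPlayer();
                player.setDataSource(tempFile.getAbsolutePath());      // play from the local copy
                player.prepare();
                player.start();
                // A real implementation would keep receiving on a background thread and
                // handle re-buffering when playback catches up with the writer.
            }
        }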

    Read the article

  • Asp.net MVC ModelState.Clear

    - by Mr Grok
    Can anyone give me a succinct definition of the role of ModelState in Asp.net MVC (or a link to one). In particular I need to know in what situations it is necessary or desirable to call ModelState.Clear(). Bit open ended huh... sorry, I think it might help if I tell you what I'm actually doing: I have an Action of Edit on a Controller called "Page". When I first see the form to change the Page's details everything loads up fine (binding to a "MyCmsPage" object). Then I click a button that generates a value for one of the MyCmsPage object's fields (MyCmsPage.SeoTitle). It generates fine and updates the object and I then return the action result with the newly modified page object and expect the relevant textbox (rendered using <%= Html.TextBox("seoTitle", page.SeoTitle) %>) to be updated ... but alas it displays the value from the old model that was loaded. I've worked around it by using ModelState.Clear() but I need to know why / how it has worked so I'm not just doing it blindly. PageController: [AcceptVerbs("POST")] public ActionResult Edit(MyCmsPage page, string submitButton) { // add the seoTitle to the current page object page.GenerateSeoTitle(); // why must I do this? ModelState.Clear(); // return the modified page object return View(page); } Aspx: <%@ Page Language="C#" MasterPageFile="~/Views/Shared/Site.Master" Inherits="System.Web.Mvc.ViewPage<MyCmsPage>" %> .... <div class="c"> <label for="seoTitle"> Seo Title</label> <%= Html.TextBox("seoTitle", page.SeoTitle)%> <input type="submit" value="Generate Seo Title" name="submitButton" /> </div>
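    For what it's worth, the behaviour described above matches how the HTML helpers work on a POST: Html.TextBox and friends read the attempted value from ModelState before they look at the model, which is why the regenerated SeoTitle never shows up. A narrower alternative to wiping everything with ModelState.Clear() is to remove just the affected key; the sketch below reuses the names from the question and is only an illustration, not necessarily the best design.

        [AcceptVerbs("POST")]
        public ActionResult Edit(MyCmsPage page, string submitButton)
        {
            page.GenerateSeoTitle();

            // On a POST the helpers prefer the attempted value stored in ModelState over the
            // value on the model, so drop only the stale entry instead of clearing all state.
            ModelState.Remove("seoTitle");

            return View(page);
        }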

    Read the article

  • Example WCF XML-RPC client C# code against custom XML-RPC server implementation?

    - by mr.b
    I have built my own little custom XML-RPC server, and since I'd like to keep things simple, on both server and client side (server side PHP runs, by the way), what I would like to accomplish is to create a simplest possible client in C# using WCF. Let's say that Contract for service exposed via XML-RPC is as follows. [ServiceContract] public interface IContract { [OperationContract(Action="Ping")] string Ping(); // server returns back string "Pong" [OperationContract(Action="Echo")] string Echo(string message); // server echoes back whatever message is } So, there are two example methods, one without any arguments, and another with simple string argument, both returning strings (just for sake of example). Service is exposed via http. What's next? Thanks for reading! P.S. I have done my homework of googling around for samples and similar, but all that I could come up with are some blog-related samples that use existing (and very big) classes, which implement correct IContract (or IBlogger) interfaces, so that most of what I am interested is hidden below several layers of abstraction...

    Read the article

  • Java POI 3.6 XWPF usage guidelines (reading content of docx file)

    - by Mr CooL
    I assume the following objects should be used to read the contents of a DOCX file: XWPFDocument and XWPFWordExtractor. However, the compiler warns me that the correct libraries are not included in the classpath. I'm kinda lost about which jar file is the right one to include, since there are so many jar files (POI libraries). My project so far involves reading doc and docx files. I've managed to read the contents of a doc file; however, for a docx file I'm still having problems. Can anyone show the guidelines, in terms of the code and libraries (jar files) needed, to read the content of a docx file? I'm trying to limit the libraries that need to be added to the project, since I only need to read doc and docx. The following works for doc: fs = new POIFSFileSystem(new FileInputStream(fileName)); HWPFDocument doc = new HWPFDocument(fs); WordExtractor we = new WordExtractor(doc); String[] p = we.getParagraphText();
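    As a hedged sketch of the XWPF side, mirroring the HWPF snippet above: for POI 3.6 the OOXML classes live outside the core poi jar, so you would add roughly poi-ooxml plus its dependencies (poi-ooxml-schemas, xmlbeans, dom4j) to the classpath; treat that list as approximate for your exact build.

        import java.io.FileInputStream;
        import org.apache.poi.xwpf.usermodel.XWPFDocument;
        import org.apache.poi.xwpf.extractor.XWPFWordExtractor;

        public class DocxTextReader {
            public static String readText(String fileName) throws Exception {
                // DOCX counterpart of the HWPF code shown in the question.
                XWPFDocument docx = new XWPFDocument(new FileInputStream(fileName));
                XWPFWordExtractor extractor = new XWPFWordExtractor(docx);
                return extractor.getText();   // whole document text, paragraph by paragraph
            }
        }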

    Read the article

  • php variable scope in oop

    - by mr o
    Hi, I wonder if anyone can help out here. I'm trying to understand how to use an object's properties across multiple non-class pages, but I can't get my head around anything I have tried so far. For example, a class called person: class person { static $name; } but I have a number of different regular pages that want to utilize $name across the board. I have been trying things like this: pageone.php include "person.php"; $names = new Person(); echo person::$name; names::$name='bob'; pagetwo.php include "person.php"; echo person::$name; I can work with classes to the extent that I'm OK as long as I am creating new instances on every page, but how can I make the properties of one object available to all, like a shared variable? Thanks
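    One thing worth spelling out: a static property only lives for the duration of a single request, so no class design by itself will carry $name from pageone.php to pagetwo.php. The usual way to share a value across separate page loads is the session; the sketch below assumes that is acceptable here and keeps the question's lowercase class name.

        <?php
        // pageone.php -- store the value for later requests
        session_start();
        require_once 'person.php';

        person::$name = 'bob';
        $_SESSION['person_name'] = person::$name;

        // pagetwo.php -- restore it on the next page load
        session_start();
        require_once 'person.php';

        person::$name = isset($_SESSION['person_name']) ? $_SESSION['person_name'] : '';
        echo person::$name;   // "bob"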

    Read the article

  • How to consume PHP SOAP service using WCF

    - by mr.b
    I am new in web services so apologize me if I am making some cardinal mistake here, hehe. I have built SOAP service using PHP. Service is SOAP 1.2 compatible, and I have WSDL available. I have enabled sessions, so that I can track login status, etc. I don't need some super security here (ie message-level security), all I need is transport security (HTTPS), since this service will be used infrequently, and performances are not so much of an issue. I am having difficulties making it to work at all. C# throws some unclear exception ("Server returned an invalid SOAP Fault. Please see InnerException for more details.", which in turn says "Unbound prefix used in qualified name 'rpc:ProcedureNotPresent'."), but consuming service using PHP SOAP client behaves as expected (including session and all). So far, I have following code. note: due to amount of real code, I am posting minimal code configuration PHP SOAP server (using Zend Soap Server library), including class(es) exposed via service: <?php class Verification_LiteralDocumentProxy { protected $instance; public function __call($methodName, $args) { if ($this->instance === null) { $this->instance = new Verification(); } $result = call_user_func_array(array($this->instance, $methodName), $args[0]); return array($methodName.'Result' => $result); } } class Verification { private $guid = ''; private $hwid = ''; /** * Initialize connection * * @param string GUID * @param string HWID * @return bool */ public function Initialize($guid, $hwid) { $this->guid = $guid; $this->hwid = $hwid; return true; } /** * Closes session * * @return void */ public function Close() { // if session is working, $this->hwid and $this->guid // should contain non-empty values } } // start up session stuff $sess = Session::instance(); require_once 'Zend/Soap/Server.php'; $server = new Zend_Soap_Server('https://www.somesite.com/api?wsdl'); $server->setClass('Verification_LiteralDocumentProxy'); $server->setPersistence(SOAP_PERSISTENCE_SESSION); $server->handle(); WSDL: <definitions xmlns="http://schemas.xmlsoap.org/wsdl/" xmlns:tns="https://www.somesite.com/api" xmlns:soap="http://schemas.xmlsoap.org/wsdl/soap/" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:soap-enc="http://schemas.xmlsoap.org/soap/encoding/" xmlns:wsdl="http://schemas.xmlsoap.org/wsdl/" name="Verification" targetNamespace="https://www.somesite.com/api"> <types> <xsd:schema targetNamespace="https://www.somesite.com/api"> <xsd:element name="Initialize"> <xsd:complexType> <xsd:sequence> <xsd:element name="guid" type="xsd:string"/> <xsd:element name="hwid" type="xsd:string"/> </xsd:sequence> </xsd:complexType> </xsd:element> <xsd:element name="InitializeResponse"> <xsd:complexType> <xsd:sequence> <xsd:element name="InitializeResult" type="xsd:boolean"/> </xsd:sequence> </xsd:complexType> </xsd:element> <xsd:element name="Close"> <xsd:complexType/> </xsd:element> </xsd:schema> </types> <portType name="VerificationPort"> <operation name="Initialize"> <documentation> Initializes connection with server</documentation> <input message="tns:InitializeIn"/> <output message="tns:InitializeOut"/> </operation> <operation name="Close"> <documentation> Closes session between client and server</documentation> <input message="tns:CloseIn"/> </operation> </portType> <binding name="VerificationBinding" type="tns:VerificationPort"> <soap:binding style="document" transport="http://schemas.xmlsoap.org/soap/http"/> <operation name="Initialize"> <soap:operation soapAction="https://www.somesite.com/api#Initialize"/> <input> 
<soap:body use="literal"/> </input> <output> <soap:body use="literal"/> </output> </operation> <operation name="Close"> <soap:operation soapAction="https://www.somesite.com/api#Close"/> <input> <soap:body use="literal"/> </input> <output> <soap:body use="literal"/> </output> </operation> </binding> <service name="VerificationService"> <port name="VerificationPort" binding="tns:VerificationBinding"> <soap:address location="https://www.somesite.com/api"/> </port> </service> <message name="InitializeIn"> <part name="parameters" element="tns:Initialize"/> </message> <message name="InitializeOut"> <part name="parameters" element="tns:InitializeResponse"/> </message> <message name="CloseIn"> <part name="parameters" element="tns:Close"/> </message> </definitions> And finally, WCF C# consumer code: [ServiceContract(SessionMode = SessionMode.Required)] public interface IVerification { [OperationContract(Action = "Initialize", IsInitiating = true)] bool Initialize(string guid, string hwid); [OperationContract(Action = "Close", IsInitiating = false, IsTerminating = true)] void Close(); } class Program { static void Main(string[] args) { WSHttpBinding whb = new WSHttpBinding(SecurityMode.Transport, true); ChannelFactory<IVerification> cf = new ChannelFactory<IVerification>( whb, "https://www.somesite.com/api"); IVerification client = cf.CreateChannel(); Console.WriteLine(client.Initialize("123451515", "15498518").ToString()); client.Close(); } } Any ideas? What am I doing wrong here? Thanks!

    Read the article

  • How to get BinarySecurityToken into the wcf soap request

    - by Mr Bell
    I need to sign my soap request to a 3rd party. The provided an example what the call should look like. And I am trying, rather unsuccessfully to make this call with wcf. I need to make a wcf soap call where the header contains BinarySecurityToken, Signature, and SecurityTokenReference. Here is the example they sent me (with some of the values omitted) I have a certificate for signing, but I cant for the life of me figure out how to make this work <?xml version="1.0" encoding="UTF-8"?> <soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:wsse="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-secext-1.0.xsd" xmlns:wsu="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-utility-1.0.xsd" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"><soapenv:Header><wsse:Security xmlns:wsse="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-secext-1.0.xsd"> <wsse:BinarySecurityToken EncodingType="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-soap-message-security-1.0#Base64Binary" ValueType="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-x509-token-profile-1.0#X509v3" wsu:Id="SecurityToken-..omitted.." xmlns:wsu="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-utility-1.0.xsd">..omitted..</wsse:BinarySecurityToken> <ds:Signature xmlns:ds="http://www.w3.org/2000/09/xmldsig#"> <ds:SignedInfo> <ds:CanonicalizationMethod Algorithm="http://www.w3.org/2001/10/xml-exc-c14n#"/> <ds:SignatureMethod Algorithm="http://www.w3.org/2000/09/xmldsig#rsa-sha1"/> <ds:Reference URI="#Body"> <ds:Transforms> <ds:Transform Algorithm="http://www.w3.org/2001/10/xml-exc-c14n#"/> </ds:Transforms> <ds:DigestMethod Algorithm="http://www.w3.org/2000/09/xmldsig#sha1"/> <ds:DigestValue>..omitted...</ds:DigestValue> </ds:Reference> </ds:SignedInfo> <ds:SignatureValue> ..omitted.. </ds:SignatureValue> <ds:KeyInfo><wsse:SecurityTokenReference><wsse:Reference URI="#SecurityToken-..omitted.." ValueType="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-x509-token-profile-1.0#X509v3"/></wsse:SecurityTokenReference></ds:KeyInfo></ds:Signature></wsse:Security></soapenv:Header><soapenv:Body wsu:Id="Body"><in0 xmlns="http://test.3rdParty.com">123</in0></soapenv:Body></soapenv:Envelope>

    Read the article

  • How do i specify wcf behaviorExtension class type without the assembly version number?

    - by Mr Bell
    I have a web app that uses a WCF service that utilizes a behaviorExtension like so: <behaviorExtensions> <add name="clientCredentialsExtension" type="Simon.Web.Giftcard.WCFSecurity.ClientCredentialsExtensionElement, Simon.Web.Giftcard, Version=1.0.3736.20411, Culture=neutral, PublicKeyToken=null"/> </behaviorExtensions> The problem is this web app's version changes with every compile (i think) and thus invalidating this entry. How can I avoid having to change the version number every time I compile this? Can I specify the extension in code somewhere?
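    If the version churn comes from a wildcard AssemblyVersion such as "1.0.*" (which the 1.0.3736.20411 pattern suggests, though that is an assumption), the simplest stabiliser is to pin the version so the assembly-qualified name in the config never changes:

        // AssemblyInfo.cs -- pin the version instead of letting the build auto-increment it,
        // so the type attribute in <behaviorExtensions> stays valid across compiles.
        [assembly: System.Reflection.AssemblyVersion("1.0.0.0")]
        [assembly: System.Reflection.AssemblyFileVersion("1.0.0.0")]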

    Read the article

  • Tutorial: Simple WCF XML-RPC client

    - by mr.b
    Update: I have provided complete code example in answer below. I have built my own little custom XML-RPC server, and since I'd like to keep things simple, on both server and client side, what I would like to accomplish is to create a simplest possible client (in C# preferably) using WCF. Let's say that Contract for service exposed via XML-RPC is as follows: [ServiceContract] public interface IContract { [OperationContract(Action="Ping")] string Ping(); // server returns back string "Pong" [OperationContract(Action="Echo")] string Echo(string message); // server echoes back whatever message is } So, there are two example methods, one without any arguments, and another with simple string argument, both returning strings (just for sake of example). Service is exposed via http. Aaand, what's next? :)

    Read the article

  • Linq 2 SQL One to Zero or One relationship possible?

    - by Mr. Flibble
    Is it possible to create a one to zero or one relationship in Linq2SQL? My understanding is that to create a one to one relationship you create a FK relationship on the PK of each table. But you cannot make the PK nullable, so I don't see how to make a one to zero or one relationship work? I'm using the designer to automatically create the model - so I would like to know how to set up the SQL tables to induce the relationship - not some custom ORM code.
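    One table shape that commonly produces a one-to-zero-or-one association in the designer is a shared primary key: the dependent table's primary key is also a foreign key to the principal's primary key, so a row's presence means "one" and its absence means "zero". The sketch below uses illustrative table names, and it is worth verifying that the designer infers the association the way you expect.

        -- Principal table
        CREATE TABLE Person (
            PersonId INT NOT NULL PRIMARY KEY,
            Name     NVARCHAR(100) NOT NULL
        );

        -- Dependent table: PK and FK are the same column, so at most one row per Person
        CREATE TABLE PersonProfile (
            PersonId INT NOT NULL PRIMARY KEY,
            Bio      NVARCHAR(MAX) NULL,
            CONSTRAINT FK_PersonProfile_Person
                FOREIGN KEY (PersonId) REFERENCES Person (PersonId)
        );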

    Read the article

  • Simple iPhone drawing app with Quartz 2D

    - by Mr guy 4
    I am making a simple iPhone drawing program as a personal side-project. I capture touch events in a subclassed UIView and render the actual stuff to a separate CGLayer. After each render, I call [self setNeedsLayout] and in the drawRect: method I draw the CGLayer to the screen context. This all works great and performs decently for drawing rectangles. However, I just want a simple "freehand" mode like a lot of other iPhone applications have. The way I thought to do this was to create a CGMutablePath, and simply: CGMutablePathRef path; -(void)touchBegan { path = CGMutablePathCreate(); } -(void)touchMoved { CGPathMoveToPoint(path,NULL,x,y); CGPathAddLineToPoint(path,NULL,x,y); } -(void)drawRect:(CGContextRef)context { CGContextBeginPath(context); CGContextAddPath(context,path); CGContextStrokePath(context); } However, after drawing for more than 1 second, performance degrades miserably. I would just draw each line into the off-screen CGLayer, if it were not for variable opacity! The less-than-100% opacity causes dots to be left on the screen connecting the lines. I have looked at CGContextSetBlendingMode() but alas I cannot find an answer. Can anyone point me in the right direction? Other iPhone apps are able to do this with very good efficiency.

    Read the article

  • Android edtftpj/PRo SFTP heap worker problem

    - by Mr. Kakakuwa Bird
    Hi I am using edtftpj-pro3.1 trial copy in my android app to make SFTP connection with the server. After few connections with the server with 5-6 file transfers, my app is crashing with following exception. Is it causing the problem or what could be the problem?? I tried setParallelMode(false) in SSHFTPClient, but it is not working. Exception i'm getting is, 05-31 18:28:12.661: ERROR/dalvikvm(589): HeapWorker is wedged: 10173ms spent inside Lcom/enterprisedt/net/j2ssh/sftp/SftpFileInputStream;.finalize()V 05-31 18:28:12.661: INFO/dalvikvm(589): DALVIK THREADS: 05-31 18:28:12.661: INFO/dalvikvm(589): "main" prio=5 tid=3 WAIT 05-31 18:28:12.661: INFO/dalvikvm(589): | group="main" sCount=1 dsCount=0 s=N obj=0x4001b260 self=0xbd18 05-31 18:28:12.661: INFO/dalvikvm(589): | sysTid=589 nice=0 sched=0/0 cgrp=default handle=-1343993192 05-31 18:28:12.661: INFO/dalvikvm(589): at java.lang.Object.wait(Native Method) 05-31 18:28:12.661: INFO/dalvikvm(589): - waiting on <0x122d70 (a android.os.MessageQueue) 05-31 18:28:12.661: INFO/dalvikvm(589): at java.lang.Object.wait(Object.java:288) 05-31 18:28:12.661: INFO/dalvikvm(589): at android.os.MessageQueue.next(MessageQueue.java:148) 05-31 18:28:12.661: INFO/dalvikvm(589): at android.os.Looper.loop(Looper.java:110) 05-31 18:28:12.661: INFO/dalvikvm(589): at android.app.ActivityThread.main(ActivityThread.java:4363) 05-31 18:28:12.661: INFO/dalvikvm(589): at java.lang.reflect.Method.invokeNative(Native Method) 05-31 18:28:12.661: INFO/dalvikvm(589): at java.lang.reflect.Method.invoke(Method.java:521) 05-31 18:28:12.661: INFO/dalvikvm(589): at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:860) 05-31 18:28:12.661: INFO/dalvikvm(589): at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:618) 05-31 18:28:12.661: INFO/dalvikvm(589): at dalvik.system.NativeStart.main(Native Method) 05-31 18:28:12.671: INFO/dalvikvm(589): "Transport protocol 1" daemon prio=5 tid=29 NATIVE 05-31 18:28:12.671: INFO/dalvikvm(589): | group="main" sCount=1 dsCount=0 s=N obj=0x44774768 self=0x3a7938 05-31 18:28:12.671: INFO/dalvikvm(589): | sysTid=605 nice=0 sched=0/0 cgrp=default handle=3834600 05-31 18:28:12.671: INFO/dalvikvm(589): at org.apache.harmony.luni.platform.OSNetworkSystem.receiveStreamImpl(Native Method) 05-31 18:28:12.671: INFO/dalvikvm(589): at org.apache.harmony.luni.platform.OSNetworkSystem.receiveStream(OSNetworkSystem.java:478) 05-31 18:28:12.671: INFO/dalvikvm(589): at org.apache.harmony.luni.net.PlainSocketImpl.read(PlainSocketImpl.java:565) 05-31 18:28:12.671: INFO/dalvikvm(589): at org.apache.harmony.luni.net.SocketInputStream.read(SocketInputStream.java:87) 05-31 18:28:12.671: INFO/dalvikvm(589): at org.apache.harmony.luni.net.SocketInputStream.read(SocketInputStream.java:67) 05-31 18:28:12.671: INFO/dalvikvm(589): at java.io.BufferedInputStream.fillbuf(BufferedInputStream.java:157) 05-31 18:28:12.671: INFO/dalvikvm(589): at java.io.BufferedInputStream.read(BufferedInputStream.java:346) 05-31 18:28:12.671: INFO/dalvikvm(589): at java.io.BufferedInputStream.read(BufferedInputStream.java:341) 05-31 18:28:12.671: INFO/dalvikvm(589): at com.enterprisedt.net.j2ssh.transport.A.A((null):-1) 05-31 18:28:12.671: INFO/dalvikvm(589): at com.enterprisedt.net.j2ssh.transport.A.B((null):-1) 05-31 18:28:12.671: INFO/dalvikvm(589): at com.enterprisedt.net.j2ssh.transport.TransportProtocolCommon.processMessages((null):-1) 05-31 18:28:12.671: INFO/dalvikvm(589): at 
com.enterprisedt.net.j2ssh.transport.TransportProtocolCommon.startBinaryPacketProtocol((null):-1) 05-31 18:28:12.671: INFO/dalvikvm(589): at com.enterprisedt.net.j2ssh.transport.TransportProtocolCommon.run((null):-1) 05-31 18:28:12.671: INFO/dalvikvm(589): at java.lang.Thread.run(Thread.java:1096) 05-31 18:28:12.671: INFO/dalvikvm(589): "StreamFrameSender" prio=5 tid=27 TIMED_WAIT 05-31 18:28:12.671: INFO/dalvikvm(589): | group="main" sCount=1 dsCount=0 s=N obj=0x44750a60 self=0x3964d8 05-31 18:28:12.671: INFO/dalvikvm(589): | sysTid=603 nice=0 sched=0/0 cgrp=default handle=3761648 05-31 18:28:12.671: INFO/dalvikvm(589): at java.lang.Object.wait(Native Method) 05-31 18:28:12.671: INFO/dalvikvm(589): - waiting on <0x399478 (a com.corventis.gateway.ppp.StreamFrameSender) 05-31 18:28:12.671: INFO/dalvikvm(589): at java.lang.Object.wait(Object.java:326) 05-31 18:28:12.671: INFO/dalvikvm(589): at com.corventis.gateway.ppp.StreamFrameSender.run(StreamFrameSender.java:154) 05-31 18:28:12.671: INFO/dalvikvm(589): at com.corventis.gateway.util.MonitoredRunnable.run(MonitoredRunnable.java:41) 05-31 18:28:12.671: INFO/dalvikvm(589): at java.lang.Thread.run(Thread.java:1096) 05-31 18:28:12.671: INFO/dalvikvm(589): "SftpActiveWorker" prio=5 tid=25 TIMED_WAIT 05-31 18:28:12.671: INFO/dalvikvm(589): | group="main" sCount=1 dsCount=0 s=N obj=0x447522b0 self=0x398e00 05-31 18:28:12.671: INFO/dalvikvm(589): | sysTid=604 nice=0 sched=0/0 cgrp=default handle=3762704 05-31 18:28:12.671: INFO/dalvikvm(589): at java.lang.Object.wait(Native Method) 05-31 18:28:12.671: INFO/dalvikvm(589): - waiting on <0x3962d8 (a com.corventis.gateway.hostcommunicator.SftpActiveWorker) 05-31 18:28:12.671: INFO/dalvikvm(589): at java.lang.Object.wait(Object.java:326) 05-31 18:28:12.671: INFO/dalvikvm(589): at com.corventis.gateway.hostcommunicator.SftpActiveWorker.run(SftpActiveWorker.java:151) 05-31 18:28:12.671: INFO/dalvikvm(589): at com.corventis.gateway.util.MonitoredRunnable.run(MonitoredRunnable.java:41) 05-31 18:28:12.671: INFO/dalvikvm(589): at java.lang.Thread.run(Thread.java:1096) 05-31 18:28:12.671: INFO/dalvikvm(589): "Thread-12" prio=5 tid=23 NATIVE 05-31 18:28:12.671: INFO/dalvikvm(589): | group="main" sCount=1 dsCount=0 s=N obj=0x4474aca8 self=0x115690 05-31 18:28:12.671: INFO/dalvikvm(589): | sysTid=602 nice=0 sched=0/0 cgrp=default handle=878120 05-31 18:28:12.671: INFO/dalvikvm(589): at android.bluetooth.BluetoothSocket.acceptNative(Native Method) 05-31 18:28:12.681: INFO/dalvikvm(589): at android.bluetooth.BluetoothSocket.accept(BluetoothSocket.java:287) 05-31 18:28:12.681: INFO/dalvikvm(589): at android.bluetooth.BluetoothServerSocket.accept(BluetoothServerSocket.java:105) 05-31 18:28:12.681: INFO/dalvikvm(589): at android.bluetooth.BluetoothServerSocket.accept(BluetoothServerSocket.java:91) 05-31 18:28:12.681: INFO/dalvikvm(589): at com.corventis.gateway.bluetooth.BluetoothManager.openPort(BluetoothManager.java:215) 05-31 18:28:12.681: INFO/dalvikvm(589): at com.corventis.gateway.bluetooth.BluetoothManager.open(BluetoothManager.java:84) 05-31 18:28:12.681: INFO/dalvikvm(589): at com.corventis.gateway.patchcommunicator.PatchCommunicator.open(PatchCommunicator.java:123) 05-31 18:28:12.681: INFO/dalvikvm(589): at com.corventis.gateway.patchcommunicator.PatchCommunicatorRunnable.run(PatchCommunicatorRunnable.java:134) 05-31 18:28:12.681: INFO/dalvikvm(589): at java.lang.Thread.run(Thread.java:1096) 05-31 18:28:12.681: INFO/dalvikvm(589): "HfGatewayApplication" prio=5 tid=21 RUNNABLE 05-31 18:28:12.681: 
INFO/dalvikvm(589): | group="main" sCount=0 dsCount=0 s=N obj=0x4472d9b0 self=0x120928 05-31 18:28:12.681: INFO/dalvikvm(589): | sysTid=601 nice=0 sched=0/0 cgrp=default handle=1264672 05-31 18:28:12.681: INFO/dalvikvm(589): at com.jcraft.jzlib.Deflate.deflateInit2(Deflate.java:~1361) 05-31 18:28:12.681: INFO/dalvikvm(589): at com.jcraft.jzlib.Deflate.deflateInit(Deflate.java:1316) 05-31 18:28:12.681: INFO/dalvikvm(589): at com.jcraft.jzlib.ZStream.deflateInit(ZStream.java:127) 05-31 18:28:12.681: INFO/dalvikvm(589): at com.jcraft.jzlib.ZStream.deflateInit(ZStream.java:120) 05-31 18:28:12.681: INFO/dalvikvm(589): at com.jcraft.jzlib.ZOutputStream.(ZOutputStream.java:62) 05-31 18:28:12.681: INFO/dalvikvm(589): at com.corventis.gateway.zipfile.ZipStorer.addStream(ZipStorer.java:211) 05-31 18:28:12.681: INFO/dalvikvm(589): at com.corventis.gateway.zipfile.ZipStorer.createZip(ZipStorer.java:127) 05-31 18:28:12.681: INFO/dalvikvm(589): at com.corventis.gateway.hostcommunicator.HostCommunicator.scanAndCompress(HostCommunicator.java:453) 05-31 18:28:12.681: INFO/dalvikvm(589): at com.corventis.gateway.hostcommunicator.HostCommunicator.doWork(HostCommunicator.java:1434) 05-31 18:28:12.681: INFO/dalvikvm(589): at com.corventis.gateway.hf.HfGatewayApplication.doWork(HfGatewayApplication.java:621) 05-31 18:28:12.681: INFO/dalvikvm(589): at com.corventis.gateway.hf.HfGatewayApplication.run(HfGatewayApplication.java:546) 05-31 18:28:12.681: INFO/dalvikvm(589): at com.corventis.gateway.util.MonitoredRunnable.run(MonitoredRunnable.java:41) 05-31 18:28:12.681: INFO/dalvikvm(589): at java.lang.Thread.run(Thread.java:1096) 05-31 18:28:12.681: INFO/dalvikvm(589): "Thread-10" prio=5 tid=19 TIMED_WAIT 05-31 18:28:12.681: INFO/dalvikvm(589): | group="main" sCount=1 dsCount=0 s=N obj=0x447287f8 self=0x1451b8 05-31 18:28:12.681: INFO/dalvikvm(589): | sysTid=598 nice=0 sched=0/0 cgrp=default handle=1331920 05-31 18:28:12.681: INFO/dalvikvm(589): at java.lang.VMThread.sleep(Native Method) 05-31 18:28:12.681: INFO/dalvikvm(589): at java.lang.Thread.sleep(Thread.java:1306) 05-31 18:28:12.681: INFO/dalvikvm(589): at java.lang.Thread.sleep(Thread.java:1286) 05-31 18:28:12.681: INFO/dalvikvm(589): at com.corventis.gateway.util.Watchdog.run(Watchdog.java:167) 05-31 18:28:12.681: INFO/dalvikvm(589): at java.lang.Thread.run(Thread.java:1096) 05-31 18:28:12.681: INFO/dalvikvm(589): "Thread-9" prio=5 tid=17 RUNNABLE 05-31 18:28:12.681: INFO/dalvikvm(589): | group="main" sCount=1 dsCount=0 s=Y obj=0x44722c90 self=0x114e20 05-31 18:28:12.681: INFO/dalvikvm(589): | sysTid=597 nice=0 sched=0/0 cgrp=default handle=1200048 05-31 18:28:12.681: INFO/dalvikvm(589): at com.corventis.gateway.time.Time.currentTimeMillis(Time.java:~77) 05-31 18:28:12.681: INFO/dalvikvm(589): at com.corventis.gateway.patchcommunicator.PatchCommunicatorState$1.run(PatchCommunicatorState.java:27) 05-31 18:28:12.681: INFO/dalvikvm(589): "Thread-8" prio=5 tid=15 RUNNABLE 05-31 18:28:12.681: INFO/dalvikvm(589): | group="main" sCount=1 dsCount=0 s=Y obj=0x44722430 self=0x124dd0 05-31 18:28:12.681: INFO/dalvikvm(589): | sysTid=596 nice=0 sched=0/0 cgrp=default handle=1199848 05-31 18:28:12.681: INFO/dalvikvm(589): at com.corventis.gateway.time.Time.currentTimeMillis(Time.java:~80) 05-31 18:28:12.681: INFO/dalvikvm(589): at com.corventis.gateway.hostcommunicator.HostCommunicatorState$1.run(HostCommunicatorState.java:35) 05-31 18:28:12.681: INFO/dalvikvm(589): "Binder Thread #2" prio=5 tid=13 NATIVE 05-31 18:28:12.681: INFO/dalvikvm(589): | group="main" sCount=1 
dsCount=0 s=N obj=0x4471ccc0 self=0x149b60 05-31 18:28:12.681: INFO/dalvikvm(589): | sysTid=595 nice=0 sched=0/0 cgrp=default handle=1317992 05-31 18:28:12.681: INFO/dalvikvm(589): at dalvik.system.NativeStart.run(Native Method) 05-31 18:28:12.681: INFO/dalvikvm(589): "Binder Thread #1" prio=5 tid=11 NATIVE 05-31 18:28:12.681: INFO/dalvikvm(589): | group="main" sCount=1 dsCount=0 s=N obj=0x447159a8 self=0x123298 05-31 18:28:12.681: INFO/dalvikvm(589): | sysTid=594 nice=0 sched=0/0 cgrp=default handle=1164896 05-31 18:28:12.681: INFO/dalvikvm(589): at dalvik.system.NativeStart.run(Native Method) 05-31 18:28:12.691: INFO/dalvikvm(589): "JDWP" daemon prio=5 tid=9 VMWAIT 05-31 18:28:12.691: INFO/dalvikvm(589): | group="system" sCount=1 dsCount=0 s=N obj=0x4470f2a0 self=0x141a90 05-31 18:28:12.691: INFO/dalvikvm(589): | sysTid=593 nice=0 sched=0/0 cgrp=default handle=1316864 05-31 18:28:12.691: INFO/dalvikvm(589): at dalvik.system.NativeStart.run(Native Method) 05-31 18:28:12.691: INFO/dalvikvm(589): "Signal Catcher" daemon prio=5 tid=7 VMWAIT 05-31 18:28:12.691: INFO/dalvikvm(589): | group="system" sCount=1 dsCount=0 s=N obj=0x4470f1e8 self=0x124970 05-31 18:28:12.691: INFO/dalvikvm(589): | sysTid=592 nice=0 sched=0/0 cgrp=default handle=1316800 05-31 18:28:12.691: INFO/dalvikvm(589): at dalvik.system.NativeStart.run(Native Method) 05-31 18:28:12.691: INFO/dalvikvm(589): "HeapWorker" daemon prio=5 tid=5 MONITOR 05-31 18:28:12.691: INFO/dalvikvm(589): | group="system" sCount=1 dsCount=0 s=N obj=0x431b4550 self=0x141670 05-31 18:28:12.691: INFO/dalvikvm(589): | sysTid=591 nice=0 sched=0/0 cgrp=default handle=1316400 05-31 18:28:12.691: INFO/dalvikvm(589): at com.enterprisedt.net.j2ssh.sftp.SftpSubsystemClient.closeHandle((null):~-1) 05-31 18:28:12.691: INFO/dalvikvm(589): at com.enterprisedt.net.j2ssh.sftp.SftpSubsystemClient.closeFile((null):-1) 05-31 18:28:12.691: INFO/dalvikvm(589): at com.enterprisedt.net.j2ssh.sftp.SftpFile.close((null):-1) 05-31 18:28:12.691: INFO/dalvikvm(589): at com.enterprisedt.net.j2ssh.sftp.SftpFileInputStream.close((null):-1) 05-31 18:28:12.691: INFO/dalvikvm(589): at com.enterprisedt.net.j2ssh.sftp.SftpFileInputStream.finalize((null):-1) 05-31 18:28:12.691: INFO/dalvikvm(589): at dalvik.system.NativeStart.run(Native Method) 05-31 18:28:12.691: ERROR/dalvikvm(589): VM aborting 05-31 18:28:12.801: INFO/DEBUG(49): * ** * ** * ** * ** * ** * 05-31 18:28:12.801: INFO/DEBUG(49): Build fingerprint: 'google/passion/passion/mahimahi:2.1-update1/ERE27/24178:user/release-keys' 05-31 18:28:12.801: INFO/DEBUG(49): pid: 589, tid: 601 com.corventis.gateway.hf <<< 05-31 18:28:12.801: INFO/DEBUG(49): signal 11 (SIGSEGV), fault addr deadd00d 05-31 18:28:12.801: INFO/DEBUG(49): r0 00000026 r1 afe13329 r2 afe13329 r3 00000000 05-31 18:28:12.801: INFO/DEBUG(49): r4 ad081f50 r5 400091e8 r6 009b3a6a r7 00000000 05-31 18:28:12.801: INFO/DEBUG(49): r8 000002e8 r9 ad082ba0 10 ad082ba0 fp 00000000 05-31 18:28:12.801: INFO/DEBUG(49): ip deadd00d sp 46937c58 lr afe14373 pc ad035b4c cpsr 20000030 05-31 18:28:12.851: INFO/DEBUG(49): #00 pc 00035b4c /system/lib/libdvm.so 05-31 18:28:12.861: INFO/DEBUG(49): #01 pc 00044d7c /system/lib/libdvm.so 05-31 18:28:12.861: INFO/DEBUG(49): #02 pc 000162e4 /system/lib/libdvm.so 05-31 18:28:12.861: INFO/DEBUG(49): #03 pc 00016b60 /system/lib/libdvm.so 05-31 18:28:12.861: INFO/DEBUG(49): #04 pc 00016ce0 /system/lib/libdvm.so 05-31 18:28:12.861: INFO/DEBUG(49): #05 pc 00057b64 /system/lib/libdvm.so 05-31 18:28:12.861: INFO/DEBUG(49): #06 pc 00057cc0 
/system/lib/libdvm.so 05-31 18:28:12.871: INFO/DEBUG(49): #07 pc 00057dd4 /system/lib/libdvm.so 05-31 18:28:12.871: INFO/DEBUG(49): #08 pc 00012ffc /system/lib/libdvm.so 05-31 18:28:12.871: INFO/DEBUG(49): #09 pc 00019338 /system/lib/libdvm.so 05-31 18:28:12.871: INFO/DEBUG(49): #10 pc 00018804 /system/lib/libdvm.so 05-31 18:28:12.871: INFO/DEBUG(49): #11 pc 0004eed0 /system/lib/libdvm.so 05-31 18:28:12.871: INFO/DEBUG(49): #12 pc 0004eef8 /system/lib/libdvm.so 05-31 18:28:12.871: INFO/DEBUG(49): #13 pc 000426d4 /system/lib/libdvm.so 05-31 18:28:12.881: INFO/DEBUG(49): #14 pc 0000fd74 /system/lib/libc.so 05-31 18:28:12.881: INFO/DEBUG(49): #15 pc 0000f840 /system/lib/libc.so 05-31 18:28:12.881: INFO/DEBUG(49): code around pc: 05-31 18:28:12.881: INFO/DEBUG(49): ad035b3c 58234808 b1036b9b f8df4798 2026c01c 05-31 18:28:12.881: INFO/DEBUG(49): ad035b4c 0000f88c ef52f7d8 0004c428 fffe631c 05-31 18:28:12.881: INFO/DEBUG(49): ad035b5c fffe94f4 000002f8 deadd00d f8dfb40e 05-31 18:28:12.881: INFO/DEBUG(49): code around lr: 05-31 18:28:12.881: INFO/DEBUG(49): afe14360 686768a5 f9b5e008 b120000c 46289201 05-31 18:28:12.881: INFO/DEBUG(49): afe14370 9a014790 35544306 37fff117 6824d5f3 05-31 18:28:12.881: INFO/DEBUG(49): afe14380 d1ed2c00 bdfe4630 00026ab0 000000b4 05-31 18:28:12.881: INFO/DEBUG(49): stack: 05-31 18:28:12.881: INFO/DEBUG(49): 46937c18 00000015 05-31 18:28:12.881: INFO/DEBUG(49): 46937c1c afe13359 /system/lib/libc.so 05-31 18:28:12.881: INFO/DEBUG(49): 46937c20 afe3b02c /system/lib/libc.so 05-31 18:28:12.891: INFO/DEBUG(49): 46937c24 afe3afd8 /system/lib/libc.so 05-31 18:28:12.891: INFO/DEBUG(49): 46937c28 00000000 05-31 18:28:12.891: INFO/DEBUG(49): 46937c2c afe14373 /system/lib/libc.so 05-31 18:28:12.891: INFO/DEBUG(49): 46937c30 afe13329 /system/lib/libc.so 05-31 18:28:12.891: INFO/DEBUG(49): 46937c34 afe13329 /system/lib/libc.so 05-31 18:28:12.891: INFO/DEBUG(49): 46937c38 afe13380 /system/lib/libc.so 05-31 18:28:12.891: INFO/DEBUG(49): 46937c3c ad081f50 /system/lib/libdvm.so 05-31 18:28:12.891: INFO/DEBUG(49): 46937c40 400091e8 /dev/ashmem/mspace/dalvik-heap/zygote/0 (deleted) 05-31 18:28:12.891: INFO/DEBUG(49): 46937c44 009b3a6a 05-31 18:28:12.891: INFO/DEBUG(49): 46937c48 00000000 05-31 18:28:12.891: INFO/DEBUG(49): 46937c4c afe1338d /system/lib/libc.so 05-31 18:28:12.891: INFO/DEBUG(49): 46937c50 df002777 05-31 18:28:12.891: INFO/DEBUG(49): 46937c54 e3a070ad 05-31 18:28:12.891: INFO/DEBUG(49): #00 46937c58 ad06f573 /system/lib/libdvm.so 05-31 18:28:12.891: INFO/DEBUG(49): 46937c5c ad044d81 /system/lib/libdvm.so 05-31 18:28:12.891: INFO/DEBUG(49): #01 46937c60 000027bd 05-31 18:28:12.891: INFO/DEBUG(49): 46937c64 00000000 05-31 18:28:12.891: INFO/DEBUG(49): 46937c68 463b6ab4 /data/dalvik-cache/data@[email protected]@classes.dex 05-31 18:28:12.891: INFO/DEBUG(49): 46937c6c 463d1ecf /data/dalvik-cache/data@[email protected]@classes.dex 05-31 18:28:12.891: INFO/DEBUG(49): 46937c70 00140450 [heap] 05-31 18:28:12.891: INFO/DEBUG(49): 46937c74 ad041d2b /system/lib/libdvm.so 05-31 18:28:12.891: INFO/DEBUG(49): 46937c78 ad082f2c /system/lib/libdvm.so 05-31 18:28:12.891: INFO/DEBUG(49): 46937c7c ad06826c /system/lib/libdvm.so 05-31 18:28:12.891: INFO/DEBUG(49): 46937c80 00140450 [heap] 05-31 18:28:12.891: INFO/DEBUG(49): 46937c84 00000000 05-31 18:28:12.891: INFO/DEBUG(49): 46937c88 000002f8 05-31 18:28:12.891: INFO/DEBUG(49): 46937c8c 400091e8 /dev/ashmem/mspace/dalvik-heap/zygote/0 (deleted) 05-31 18:28:12.891: INFO/DEBUG(49): 46937c90 ad081f50 /system/lib/libdvm.so 05-31 18:28:12.891: 
INFO/DEBUG(49): 46937c94 000002f8 05-31 18:28:12.891: INFO/DEBUG(49): 46937c98 00002710 05-31 18:28:12.891: INFO/DEBUG(49): 46937c9c ad0162e8 /system/lib/libdvm.so Thanks & Regards,

    Read the article

  • Using Repository and Unit of Work patterns with Entity Framework 4.0 and MVC 2

    - by Mr. D
    Hi, I'm following this article Using Repository and Unit of Work patterns with Entity Framework 4.0. I'm trying to implement the Repository and Unit of Work pattern, using Asp.Net MVC 2 and Entity Framework 4. Please let me know if I'm doing it right... In the Models folder: Northwind.edmx Products.cs (POCO class) ProductRepository.cs (Did my product query) IProductRepository.cs NorthwindContext.cs IUnitOfWork.cs In the Controller folder: ProductController.cs (Retrieves from ProductRepository.cs and passes it to the view) When I run the application, I'm getting the error message: Mapping and metadata information could not be found for EntityType 'NorthwindMvcPoco.Models.Category'. I don't know what I'm doing wrong. I have searched all over the web and couldn't resolve this issue. Please help me.

    Read the article

  • Encrypting using RSA via COM Interop = "The requested operation requires delegation to be enabled on

    - by Mr AH
    Hi Guys, So I've got this little static method in a .Net class which takes a string, uses some stored public key and returns the encrypted version of that string. This is basically so some user-entered data can be saved and encrypted, then retrieved and decrypted at a later date. Pretty basic stuff, and the unit test works fine. However, part of the application is in classic ASP. This then uses a COM-visible version of the class to go off and invoke the method on the real class and return the same string to the COM client (classic ASP). I use this kind of stuff all the time, but in this case we have a major problem. As the method is doing something with RSA keys and has to access certain machine information to do so, we get the error: "The requested operation requires delegation to be enabled on the machine." I've searched around a lot, but can't really understand what this means. I assume I am getting this error via COM but not in the unit test because the unit test runs as me (Administrator) and classic ASP runs as IWAM. Anyone know what I need to do to enable IWAM to do this? Or indeed if this is the real problem here?
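    One avenue worth checking (an assumption about the cause, not a confirmed fix for this exact error) is where the RSA key container lives: under an interactive account the key sits in the user profile, which service identities such as IWAM may not be able to load, so forcing the machine key store is a common adjustment for crypto code that runs under IIS.

        using System.Security.Cryptography;
        using System.Text;

        static class RsaMachineStoreExample
        {
            // Hedged sketch: keep the RSA key in the machine key store rather than the current
            // user's profile; "MyAppRsaContainer" is an illustrative container name.
            public static byte[] Encrypt(string plainText)
            {
                var csp = new CspParameters
                {
                    KeyContainerName = "MyAppRsaContainer",
                    Flags = CspProviderFlags.UseMachineKeyStore
                };
                using (var rsa = new RSACryptoServiceProvider(csp))
                {
                    return rsa.Encrypt(Encoding.UTF8.GetBytes(plainText), false);
                }
            }
        }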

    Read the article

  • append set to another set

    - by mr.bio
    Hi, is there a better way of appending a set to another set than iterating through each element? I have: set<string> foo ; set<string> bar ; ..... for (set<string>::const_iterator p = foo.begin( );p != foo.end( ); ++p) bar.insert(*p); Is there a more efficient way to do this?
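    For reference, std::set has a range insert that does this in one call; it is the idiomatic form, although it is still element-by-element work underneath.

        #include <set>
        #include <string>

        int main()
        {
            std::set<std::string> foo;
            std::set<std::string> bar;

            // Range insert: copies every element of foo into bar in one call;
            // duplicates are ignored automatically, as with any std::set insert.
            bar.insert(foo.begin(), foo.end());
        }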

    Read the article

  • nHibernate, Automapping and Chained Abstract Classes

    - by Mr Snuffle
    I'm having some trouble using nHibernate, automapping and a class structure using multiple chains of abstract classes It's something akin to this public abstract class AbstractClassA {} public abstract class AbstractClassB : AbstractClassA {} public class ClassA : AbstractClassB {} When I attempt to build these mappings, I receive the following error "FluentNHibernate.Cfg.FluentConfigurationException was unhandled Message: An invalid or incomplete configuration was used while creating a SessionFactory. Check PotentialReasons collection, and InnerException for more detail. Database was not configured through Database method." However, if I remove the abstract keyword from AbstractClassB, everything works fine. The problem only occurs when I have more than one abstract class in the class hierarchy. I've manually configured the automapping to include both AbstractClassA and AbstractClassB using the following binding class public class BindItemBases : IManualBinding { public void Bind(FluentNHibernate.Automapping.AutoPersistenceModel model) { model.IncludeBase<AbstractClassA>(); model.IncludeBase<AbstractClassB>(); } } I've had to do a bit of hackery to get around this, but there must be a better way to get this working. Surely nHibernate supports something like this, I just haven't figured out how to configure it right. Cheers, James

    Read the article

  • How to add a default value to a custom ASP.NET Profile property

    - by mr.moses
    I know you can add defaultValues using the web.config like this: <profile> <properties> <add name="AreCool" type="System.Boolean" defaultValue="False" /> </properties> </profile> but I have the Profile inherited from a class: <profile inherits="CustomProfile" defaultProvider="CustomProfileProvider" enabled="true"> <providers> <clear /> <add name="CustomProfileProvider" type="CustomProfileProvider" /> </providers> </profile> Heres the class: Public Class CustomProfile Inherits ProfileBase Public Property AreCool() As Boolean Get Return Me.GetPropertyValue("AreCool") End Get Set(ByVal value As Boolean) Me.SetPropertyValue("AreCool", value) End Set End Property End Class I don't know how to set the default value of the property. Its causing errors because without a default value, it uses an empty string, which cannot be converted to a Boolean. I tried adding <DefaultSettingValue("False")> _ but that didn't seem to make a difference. I'm also using a custom ProfileProvider (CustomProfileProvider).

    Read the article

  • MSBuild 2010 - how to publish web app to a specific location (nant)?

    - by Mr. Flibble
    I'm trying to get MSBuild 2010 to publish a web app to a specific location. I can get it to publish the deployment package to a particular path, but the deployment package then adds it's own path that changes. For example: if I tell it to publish to C:\dev\build\Output\Debug then the actual web files end up at C:\dev\build\Output\Debug\Archive\Content\C_C\code\sawadee\frontend\IPP-FrontEnd\Source\ControllersViews\obj\Debug\Package\PackageTmp And the C_C part of the path changes (not sure how it chooses this part of the path). This means I can't just script a copy from the publish location. I'm using this nant/msbuild command at the moment: <target name="compile" description="Compiles"> <msbuild project="${name}.sln"> <property name="Platform" value="Any CPU"/> <property name="Configuration" value="Debug"/> <property name="DeployOnBuild" value="true"/> <property name="DeployTarget" value="Package"/> <property name="PackageLocation" value="C:\dev\build\Output\Debug\"/> <property name="AutoParameterizationWebConfigConnectionStrings" value="false"/> <property name="PackageAsSingleFile" value="false"/> </msbuild> Any ideas on how to get it to send the web files directly to a specific location?
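    One workaround often paired with the Package target is overriding the _PackageTempDir property, which controls where the expanded PackageTmp content is written. Note the leading underscore: it is an internal Web Publishing Pipeline property, so treat this as an assumption that may change between tool versions rather than a documented contract.

        <msbuild project="${name}.sln">
          <property name="Platform" value="Any CPU"/>
          <property name="Configuration" value="Debug"/>
          <property name="DeployOnBuild" value="true"/>
          <property name="DeployTarget" value="Package"/>
          <!-- Internal WPP property: drops the expanded web files into a fixed folder -->
          <property name="_PackageTempDir" value="C:\dev\build\Output\Debug\Website"/>
          <property name="AutoParameterizationWebConfigConnectionStrings" value="false"/>
        </msbuild>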

    Read the article

  • Reliable way of generating unique hardware ID

    - by mr.b
    Question: what's the best way to accomplish following. I have to come up with unique ID for each networked client, such that: it (ID) should persist once client software is installed on target computer, and should continue to persist if software is re-installed on same computer and same OS installment, it should not change if hardware configuration is modified in most ways (except changing the motherboard) When hard drive with client software installed is cloned to another computer with identical hardware configuration (or, as similar as possible), client software should be aware of that change. A little bit of explanation and some back-story: This question is basically age old question that also touches topic of software copy-protection, as some of mechanisms used in that area are mentioned here. I should be clear at this point that I'm not looking for a copy-protection scheme. Please, read on. :) I'm working on a client-server software that is supposed to work in local network. One of problems I have to solve is to identify each unique client in network (not so much of a problem), so that I can apply certain attributes to every specific client, retain and enforce those attributes during deployment lifetime of a specific client. While I was looking for a solution, I was aware of following: Windows activation system uses some kind of heavy fingerprinting mechanism, that is extremely sensitive to hardware modifications, Disk imaging software copies along all Volume IDs (tied to each partition when formatted), and custom, uniquely generated IDs during installation process, during first run, or in any other way, that is strictly software in its nature, and stored in registry or on hard drive, so it's very easy to confuse two Obvious choice for this kind of problem would be to find out BIOS identifiers (not 100% sure if this is unique through identical motherboard models, though), as that's the only thing I can rely on, that isn't duplicated, transferred by cloning, and that can't be changed (at least not by using some user-space program). Everything else fails as either being not reliable (MAC cloning, anyone?), or too demanding (in terms that it's too sensitive to configuration changes). Am I missing something obvious here? Sub-question that I'd like to ask is, am I doing it correctly, architecture-wise? Perhaps there is a better tool for task that I have to accomplish... Another approach I had in mind is something similar to handshake mechanism, where server maintains internal lookup table of connected client IDs (which can be even completely software-based and non-unique at any given moment), and tells client to come up with different ID during handshake, if duplicate ID is provided upon connection. That approach, unfortunately, doesn't play nicely with one of requirements to tie attributes to specific client during lifetime.
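    As a building block for the BIOS-based approach (a sketch, not a complete fingerprinting scheme): WMI exposes the BIOS and baseboard serial numbers, which survive OS reinstalls and disk cloning. Be aware that some OEM boards report blank or placeholder serials, so a real implementation should combine several sources and tolerate missing values.

        using System.Management;   // add a reference to System.Management.dll

        static class HardwareId
        {
            // Reads a single WMI property, e.g. ("Win32_BIOS", "SerialNumber")
            // or ("Win32_BaseBoard", "SerialNumber").
            public static string ReadWmiProperty(string wmiClass, string property)
            {
                using (var searcher = new ManagementObjectSearcher(
                    "SELECT " + property + " FROM " + wmiClass))
                {
                    foreach (ManagementObject mo in searcher.Get())
                    {
                        object value = mo[property];
                        if (value != null) return value.ToString().Trim();
                    }
                }
                return string.Empty;
            }
        }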

    Read the article

< Previous Page | 8 9 10 11 12 13 14 15 16 17 18 19  | Next Page >