Search Results

Search found 45471 results on 1819 pages for 'microsoft project 2007'.

Page 431/1819 | < Previous Page | 427 428 429 430 431 432 433 434 435 436 437 438  | Next Page >

  • Autounattend.xml not being recognized in VirtualBox

    - by beagle
    I am working my way through the steps on this page to prepare an unattended installation of Windows 7 Enterprise x64, for the purposes of a college assignment which simply requires the process to be carried out and documented. Both the "technician" and "reference" computers are virtual machines created in VirtualBox 4.3.12, as will be the destination computer. I seem to have successfully completed Step 1, building an Autounattend.xml answer file using Windows System Image Manager, insofar as the answer file validates successfully. The problem arises when I try to install Windows on the reference machine from the DVD image in conjunction with the Autounattend file on a USB drive. I have tried a couple of different USB devices, and the devices themselves seem to be recognized, but the answer file is not: instead of taking the configuration settings from the file, the user interface appears as in a manual installation. Has anyone come across this problem, or a solution? The XML created by Windows SIM is below for reference, in case the problem is with the file itself.

        <?xml version="1.0" encoding="utf-8"?>
        <unattend xmlns="urn:schemas-microsoft-com:unattend">
          <settings pass="oobeSystem">
            <component name="Microsoft-Windows-Deployment" processorArchitecture="amd64"
                       publicKeyToken="31bf3856ad364e35" language="neutral" versionScope="nonSxS"
                       xmlns:wcm="http://schemas.microsoft.com/WMIConfig/2002/State"
                       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
              <Reseal>
                <Mode>Audit</Mode>
              </Reseal>
            </component>
            <component name="Microsoft-Windows-Shell-Setup" processorArchitecture="amd64"
                       publicKeyToken="31bf3856ad364e35" language="neutral" versionScope="nonSxS"
                       xmlns:wcm="http://schemas.microsoft.com/WMIConfig/2002/State"
                       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
              <OOBE>
                <HideEULAPage>true</HideEULAPage>
                <ProtectYourPC>3</ProtectYourPC>
              </OOBE>
            </component>
          </settings>
          <settings pass="windowsPE">
            <component name="Microsoft-Windows-International-Core-WinPE" processorArchitecture="amd64"
                       publicKeyToken="31bf3856ad364e35" language="neutral" versionScope="nonSxS"
                       xmlns:wcm="http://schemas.microsoft.com/WMIConfig/2002/State"
                       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
              <SetupUILanguage>
                <UILanguage>en-IE</UILanguage>
              </SetupUILanguage>
              <InputLocale>en-IE</InputLocale>
              <SystemLocale>en-IE</SystemLocale>
              <UILanguage>en-IE</UILanguage>
              <UserLocale>en-IE</UserLocale>
            </component>
            <component name="Microsoft-Windows-Setup" processorArchitecture="amd64"
                       publicKeyToken="31bf3856ad364e35" language="neutral" versionScope="nonSxS"
                       xmlns:wcm="http://schemas.microsoft.com/WMIConfig/2002/State"
                       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
              <DiskConfiguration>
                <Disk wcm:action="add">
                  <CreatePartitions>
                    <CreatePartition wcm:action="add">
                      <Order>1</Order>
                      <Size>300</Size>
                      <Type>Primary</Type>
                    </CreatePartition>
                    <CreatePartition wcm:action="add">
                      <Order>2</Order>
                      <Extend>true</Extend>
                      <Type>Primary</Type>
                    </CreatePartition>
                  </CreatePartitions>
                  <ModifyPartitions>
                    <ModifyPartition wcm:action="add">
                      <Active>true</Active>
                      <Format>NTFS</Format>
                      <Label>System</Label>
                      <Order>1</Order>
                      <PartitionID>1</PartitionID>
                    </ModifyPartition>
                    <ModifyPartition wcm:action="add">
                      <Format>NTFS</Format>
                      <Label>Windows</Label>
                      <Order>2</Order>
                      <PartitionID>2</PartitionID>
                    </ModifyPartition>
                  </ModifyPartitions>
                  <DiskID>0</DiskID>
                  <WillWipeDisk>true</WillWipeDisk>
                </Disk>
                <WillShowUI>OnError</WillShowUI>
              </DiskConfiguration>
              <ImageInstall>
                <OSImage>
                  <InstallTo>
                    <DiskID>0</DiskID>
                    <PartitionID>2</PartitionID>
                  </InstallTo>
                  <InstallToAvailablePartition>false</InstallToAvailablePartition>
                  <WillShowUI>OnError</WillShowUI>
                </OSImage>
              </ImageInstall>
              <UserData>
                <ProductKey>
                  <WillShowUI>OnError</WillShowUI>
                </ProductKey>
                <AcceptEula>true</AcceptEula>
              </UserData>
            </component>
          </settings>
          <settings pass="specialize">
            <component name="Microsoft-Windows-IE-InternetExplorer" processorArchitecture="amd64"
                       publicKeyToken="31bf3856ad364e35" language="neutral" versionScope="nonSxS"
                       xmlns:wcm="http://schemas.microsoft.com/WMIConfig/2002/State"
                       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
              <Home_Page>http://www.example.com</Home_Page>
            </component>
          </settings>
          <cpi:offlineImage cpi:source="wim://technician/users/user/desktop/install.wim#Windows 7 ENTERPRISE"
                            xmlns:cpi="urn:schemas-microsoft-com:cpi" />
        </unattend>

    Read the article

  • New-ManagedContentSettings - not working properly under Exchange 2010

    - by mfinni
    I have a client that is divesting a business unit into a new AD forest, Exchange org, etc. We're using Quest tools to migrate users and mailboxes. However, I have to build the new infrastructure to match the old one. In the old one, we're using Managed Folder Mailbox Policies to limit (or allow) retention. They started with Exchange 2007 and never upgraded to Retention Policies; oh well. So, in the old environment, when you use a 2007 server to define a new Managed Content Setting, you can pick "Email" from the dropdown for MessageClass. This is a display name; the actual MessageClass values are these:

        MessageClass : IPM.Note;IPM.Note.AS/400 Move Notification Form v1.0;IPM.Note.Delayed;IPM.Note.Exchange.ActiveSync.Report;IPM.Note.JournalReport.Msg;IPM.Note.JournalReport.Tnef;IPM.Note.Microsoft.Missed.Voice;IPM.Note.Rules.OofTemplate.Microsoft;IPM.Note.Rules.ReplyTemplate.Microsoft;IPM.Note.Secure.Sign;IPM.Note.SMIME;IPM.Note.SMIME.MultipartSigned;IPM.Note.StorageQuotaWarning;IPM.Note.StorageQuotaWarning.Warning;IPM.Notification.Meeting.Forward;IPM.Outlook.Recall;IPM.Recall.Report.Success;IPM.Schedule.Meeting.*;REPORT.IPM.Note.NDR

    If I take that and try to mangle it into a new cmdlet for Exchange 2010 in my new environment, here's what I get:

        New-ManagedContentSettings -Name "Delete Messages older then 90 days" -FolderName "Entire Mailbox" -RetentionEnabled $True -AgeLimitForRetention 90 -TriggerForRetention WhenDelivered -RetentionAction DeleteAndAllowRecovery -MessageClass "IPM.Note","IPM.Note.AS/400MoveNotificationFormv1.0","IPM.Note.Delayed","IPM.Note.Exchange.ActiveSync.Report","IPM.Note.JournalReport.Msg","IPM.Note.JournalReport.Tnef","IPM.Note.Microsoft.Missed.Voice","IPM.Note.Rules.OofTemplate.Microsoft","IPM.Note.Rules.ReplyTemplate.Microsoft","IPM.Note.Secure.Sign","IPM.Note.SMIME","IPM.Note.SMIME.MultipartSigned","IPM.Note.StorageQuotaWarning","IPM.Note.StorageQuotaWarning.Warning","IPM.Notification.Meeting.Forward","IPM.Outlook.Recall","IPM.Recall.Report.Success","IPM.Schedule.Meeting.*","REPORT.IPM.Note.NDR" -whatif

        Invoke-Command : Cannot bind parameter 'MessageClass' to the target. Exception setting "MessageClass": "The length of the property is too long. The maximum length is 255 and the length of the value provided is 518."
        At C:\Users\MFinnigan.sa\AppData\Roaming\Microsoft\Exchange\RemotePowerShell\pfexcas02.fve.ad.5ssl.com\pfexcas02.fve.ad.5ssl.com.psm1:28204 char:29
        + $scriptCmd = { & <<<< $script:InvokeCommand `
            + CategoryInfo : WriteError: (:) [New-ManagedContentSettings], ParameterBindingException
            + FullyQualifiedErrorId : ParameterBindingFailed,Microsoft.Exchange.Management.SystemConfigurationTasks.NewManagedContentSettings

    So, the config object can store all that mess, but I can't fit it in through the cmdlet to create the object. Lovely. Any ideas?

    Read the article

  • Multiple versions of .NET

    - by grawity
    On my Windows XP box, I have these "programs" installed:
    • Microsoft .NET Framework 1.1
    • Microsoft .NET Framework 2.0 SP2
    • Microsoft .NET Framework 3.0 SP2
    • Microsoft .NET Framework 3.5 SP1
    Do I need all four versions? Can software compiled against .NET 1.1 run on the 3.5 runtime?
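    For background (my note, not from the question): 2.0, 3.0, and 3.5 are cumulative layers over the same CLR 2.0 runtime, while 1.1 carries its own runtime, which is why they can all coexist side by side. A minimal C# check, purely illustrative, shows which CLR a given process actually runs on:

        using System;

        class ClrVersionCheck
        {
            static void Main()
            {
                // Apps built for .NET 2.0, 3.0, or 3.5 all report a
                // 2.0.50727.x CLR here; a .NET 1.1 app reports 1.1.4322.x.
                Console.WriteLine(Environment.Version);
            }
        }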

    Read the article

  • Importing Analysis Services 2008 KPI's in a PerformancePoint scorecard

    - by Colin
    I am trying to import a KPI from Analysis Services into a PerformancePoint scorecard, and when I do, the Dashboard Designer throws an error: "An unknown error has occurred. If the problem persists contact an administrator. There may be additional information in the server application event log." When I examine the event log, I find the following exception:

        System.IO.FileNotFoundException: Could not load file or assembly 'Microsoft.AnalysisServices, Version=9.0.242.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91' or one of its dependencies. The system cannot find the file specified.
        File name: 'Microsoft.AnalysisServices, Version=9.0.242.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91'
          at Microsoft.PerformancePoint.Scorecards.Server.ImportExportHelper.GetImportableAsKpis(IBpm pmService, DataSource asDataSource)
          at Microsoft.PerformancePoint.Scorecards.Server.PmServer.GetImportableAsKpis(DataSource dataSource)

    I have found this thread which recommends reinstalling Microsoft ADOMD.NET, but the installer for that won't run because the server already has a newer version of the product (the server is running SQL Server Analysis Services 2008, which includes Microsoft.AnalysisServices.AdomdClient.dll version 9.0.3042.0). Anyone have any ideas (short of finding the DLL myself and manually installing it to the GAC)?
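    Worth noting (my observation, not from the post): Microsoft.AnalysisServices, the AMO library, at version 9.0.242.0 is the SQL Server 2005 build, and it is a different assembly from Microsoft.AnalysisServices.AdomdClient, so a newer ADOMD.NET client alone would not satisfy this reference. A small diagnostic sketch, assuming a .NET console app run on the SharePoint box, confirms whether that exact identity resolves at all:

        using System;
        using System.IO;
        using System.Reflection;

        class ResolveCheck
        {
            static void Main()
            {
                // The exact assembly identity PerformancePoint is asking for.
                const string identity =
                    "Microsoft.AnalysisServices, Version=9.0.242.0, " +
                    "Culture=neutral, PublicKeyToken=89845dcd8080cc91";
                try
                {
                    Assembly asm = Assembly.Load(identity);
                    Console.WriteLine("Resolved: " + asm.Location);
                }
                catch (FileNotFoundException)
                {
                    Console.WriteLine("Not resolvable: the SQL 2005 AMO " +
                                      "library is not in the GAC here.");
                }
            }
        }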

    Read the article

  • Multiple Silverlight Unit Test Projects in Solution

    - by IUnknown
    I am building out a number of Silverlight 4.0 libraries that are part of the same solution. I like to break them into separate projects and have a unit test project for each:

        SolutionX
        - LibraryProject1
        --- Class1.cs
        --- Class2.cs
        - LibraryProject1.Test
        --- Tests1.cs
        --- Tests2.cs
        - LibraryProject2
        --- Class1.cs
        --- Class2.cs
        --- Class3.cs
        - LibraryProject2.Test
        --- Tests1.cs
        --- Tests2.cs
        --- Tests3.cs
        - LibraryProject3
        --- Class1.cs
        - LibraryProject3.Test
        --- Tests1.cs

    This works great when using regular Visual Studio test projects and infrastructure, because I can create and execute lists of tests that are aggregated from each test project. But with the Silverlight Unit Test Framework, since the Silverlight unit test project must be the "startup project", I cannot figure out how to run a collection of tests from each test project in one go. I have to run each separately and switch the starting project each time. I would prefer to avoid creating complex build scripts or build definitions - is there a way to run all the tests at once? -Thanks
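    One approach worth sketching (my note, not from the question): the Silverlight Unit Test Framework can run several test assemblies from a single test host, so one dedicated host application could stay the lone startup project. A rough sketch, assuming the Microsoft.Silverlight.Testing toolkit API; the namespaces and test class names are taken from the question's layout and are otherwise assumptions:

        // App.xaml.cs of a single "AllTests" Silverlight application that
        // references every *.Test project in the solution.
        using System.Windows;
        using Microsoft.Silverlight.Testing;

        public partial class App : Application
        {
            public App()
            {
                InitializeComponent();
                Startup += (sender, e) =>
                {
                    var settings = UnitTestSystem.CreateDefaultSettings();
                    // Pull in each test assembly via one type it contains.
                    settings.TestAssemblies.Add(typeof(LibraryProject1.Test.Tests1).Assembly);
                    settings.TestAssemblies.Add(typeof(LibraryProject2.Test.Tests1).Assembly);
                    settings.TestAssemblies.Add(typeof(LibraryProject3.Test.Tests1).Assembly);
                    RootVisual = UnitTestSystem.CreateTestPage(settings);
                };
            }
        }

    Whether this plays well with a given build setup is untested here; it simply avoids switching the startup project per test run.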

    Read the article

  • Winforms TabControl causing spurious Paint events for UserControl

    - by Tom Bushell
    For our project, we've written a WinForms UserControl for graphing. We're seeing some strange behavior when our control is sited in a TabControl: our control continuously fires Paint events, even when there is absolutely no activity by the user. We only see this in the TabControl. When we site our control in other containers such as Forms or Splitters, Paint is fired only when you'd expect, e.g. when the control is first displayed. Can anyone suggest why this might be happening? Here's a stack trace from a breakpoint in our control's Paint handler, if that's any help:

        OverlordFrontEnd.exe!OverlordFrontEnd.MainForm.graphControl_Paint(object sender = {BI_BaseGraphXY.BaseGraphXY}, System.Windows.Forms.PaintEventArgs e = {ClipRectangle = {X=0,Y=0,Width=1031,Height=408}}) Line 422 C#
        System.Windows.Forms.dll!System.Windows.Forms.Control.OnPaint(System.Windows.Forms.PaintEventArgs e) + 0x73 bytes
        BI_AppCore.dll!BI_BaseGraphXY.BaseGraphXY.OnPaint(System.Windows.Forms.PaintEventArgs e = {ClipRectangle = {X=0,Y=0,Width=1031,Height=408}}) Line 377 + 0xb bytes C#
        System.Windows.Forms.dll!System.Windows.Forms.Control.PaintTransparentBackground(System.Windows.Forms.PaintEventArgs e, System.Drawing.Rectangle rectangle, System.Drawing.Region transparentRegion = null) + 0x16c bytes
        System.Windows.Forms.dll!System.Windows.Forms.Control.PaintBackground(System.Windows.Forms.PaintEventArgs e = {ClipRectangle = {X=0,Y=0,Width=1029,Height=406}}, System.Drawing.Rectangle rectangle, System.Drawing.Color backColor, System.Drawing.Point scrollOffset) + 0xbc bytes
        System.Windows.Forms.dll!System.Windows.Forms.Control.PaintBackground(System.Windows.Forms.PaintEventArgs e, System.Drawing.Rectangle rectangle) + 0x63 bytes
        System.Windows.Forms.dll!System.Windows.Forms.Control.OnPaintBackground(System.Windows.Forms.PaintEventArgs pevent) + 0x59 bytes
        System.Windows.Forms.dll!System.Windows.Forms.Control.PaintWithErrorHandling(System.Windows.Forms.PaintEventArgs e = {ClipRectangle = {X=0,Y=0,Width=1029,Height=406}}, short layer, bool disposeEventArgs = false) + 0x74 bytes
        System.Windows.Forms.dll!System.Windows.Forms.Control.WmPaint(ref System.Windows.Forms.Message m) + 0x1ba bytes
        System.Windows.Forms.dll!System.Windows.Forms.Control.WndProc(ref System.Windows.Forms.Message m) + 0x33e bytes
        System.Windows.Forms.dll!System.Windows.Forms.Control.ControlNativeWindow.OnMessage(ref System.Windows.Forms.Message m) + 0x10 bytes
        System.Windows.Forms.dll!System.Windows.Forms.Control.ControlNativeWindow.WndProc(ref System.Windows.Forms.Message m) + 0x31 bytes
        System.Windows.Forms.dll!System.Windows.Forms.NativeWindow.Callback(System.IntPtr hWnd, int msg = 15, System.IntPtr wparam, System.IntPtr lparam) + 0x5a bytes
        [Native to Managed Transition]
        [Managed to Native Transition]
        System.Windows.Forms.dll!System.Windows.Forms.Application.ComponentManager.System.Windows.Forms.UnsafeNativeMethods.IMsoComponentManager.FPushMessageLoop(int dwComponentID, int reason = -1, int pvLoopData = 0) + 0x24e bytes
        System.Windows.Forms.dll!System.Windows.Forms.Application.ThreadContext.RunMessageLoopInner(int reason = -1, System.Windows.Forms.ApplicationContext context = {Microsoft.VisualBasic.ApplicationServices.WindowsFormsApplicationBase.WinFormsAppContext}) + 0x177 bytes
        System.Windows.Forms.dll!System.Windows.Forms.Application.ThreadContext.RunMessageLoop(int reason, System.Windows.Forms.ApplicationContext context) + 0x61 bytes
        System.Windows.Forms.dll!System.Windows.Forms.Application.Run(System.Windows.Forms.ApplicationContext context) + 0x18 bytes
        Microsoft.VisualBasic.dll!Microsoft.VisualBasic.ApplicationServices.WindowsFormsApplicationBase.OnRun() + 0x81 bytes
        Microsoft.VisualBasic.dll!Microsoft.VisualBasic.ApplicationServices.WindowsFormsApplicationBase.DoApplicationModel() + 0xef bytes
        Microsoft.VisualBasic.dll!Microsoft.VisualBasic.ApplicationServices.WindowsFormsApplicationBase.Run(string[] commandLine) + 0x2c0 bytes
        OverlordFrontEnd.exe!OverlordFrontEnd.Program.Main() Line 36 + 0x10 bytes C#
        [Native to Managed Transition]
        [Managed to Native Transition]
        mscorlib.dll!System.AppDomain.ExecuteAssembly(string assemblyFile, System.Security.Policy.Evidence assemblySecurity, string[] args) + 0x3a bytes
        Microsoft.VisualStudio.HostingProcess.Utilities.dll!Microsoft.VisualStudio.HostingProcess.HostProc.RunUsersAssembly() + 0x2b bytes
        mscorlib.dll!System.Threading.ThreadHelper.ThreadStart_Context(object state) + 0x66 bytes
        mscorlib.dll!System.Threading.ExecutionContext.Run(System.Threading.ExecutionContext executionContext, System.Threading.ContextCallback callback, object state) + 0x6f bytes
        mscorlib.dll!System.Threading.ThreadHelper.ThreadStart() + 0x44 bytes
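    A note on the trace (mine, not from the post): the repaint enters through Control.PaintTransparentBackground, the path WinForms takes when a control's background is transparent and the parent must be painted first, which is a common ingredient in repaint feedback inside TabPages. A hedged sketch of one mitigation to try in the UserControl's constructor; the class name is hypothetical, the styles are real WinForms APIs, and whether this cures this particular loop is an assumption:

        public GraphControl()  // stand-in name for the graphing UserControl
        {
            InitializeComponent();

            // Paint everything into an off-screen buffer and use an opaque
            // background, so the PaintTransparentBackground path seen in the
            // stack trace is never taken.
            SetStyle(System.Windows.Forms.ControlStyles.UserPaint
                   | System.Windows.Forms.ControlStyles.AllPaintingInWmPaint
                   | System.Windows.Forms.ControlStyles.OptimizedDoubleBuffer, true);
            BackColor = System.Drawing.Color.White;
        }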

    Read the article

  • Why is ListBoxFor not selecting items, but ListBox is?

    - by Roger Rogers
    I have the following code in my view:

        <%= Html.ListBoxFor(c => c.Project.Categories,
                new MultiSelectList(Model.Categories, "Id", "Name", new List<int> { 1, 2 })) %>
        <%= Html.ListBox("MultiSelectList",
                new MultiSelectList(Model.Categories, "Id", "Name", new List<int> { 1, 2 })) %>

    The only difference is that the first helper is strongly typed (ListBoxFor), and it fails to show the selected items (1, 2), even though the items appear in the list, etc. The simpler ListBox works as expected. I'm obviously missing something here. I can use the second approach, but this is really bugging me and I'd like to figure it out. For reference, my model is:

        public class ProjectEditModel
        {
            public Project Project { get; set; }
            public IEnumerable<Project> Projects { get; set; }
            public IEnumerable<Client> Clients { get; set; }
            public IEnumerable<Category> Categories { get; set; }
            public IEnumerable<Tag> Tags { get; set; }
            public ProjectSlide SelectedSlide { get; set; }
        }

    Update: I just changed the ListBox name to Project.Categories (matching my model) and it now FAILS to select the item.

        <%= Html.ListBox("Project.Categories",
                new MultiSelectList(Model.Categories, "Id", "Name", new List<int> { 1, 2 })) %>

    I'm obviously not understanding the magic that is happening here.

    Update 2: OK, this is purely naming. For example, this works...

        <%= Html.ListBox("Project_Tags",
                new MultiSelectList(Model.Tags, "Id", "Name", Model.Project.Tags.Select(t => t.Id))) %>

    ...because the field name is Project_Tags, not Project.Tags; in fact, anything other than Tags or Project.Tags will work. I don't get why this would cause a problem (other than that it matches the entity name), and I'm not good enough at this to be able to dig in and find out.
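    A hedged reading of what may be happening (my note, not from the thread): when the name or expression matches a real model property (Project.Categories), the helper derives the selected values from that property's current value and ignores the selected-values argument of the MultiSelectList; a collection of Category entities never string-compares equal to "1" or "2", so nothing shows selected. ListBox with an unrelated name has no model value to consult, so the MultiSelectList's selections survive. If that's right, the usual shape of a fix is to bind the list box to a simple value collection; the property name below is hypothetical:

        using System.Collections.Generic;

        public class ProjectEditModel
        {
            // Hypothetical addition: plain IDs for the list box to bind to, so the
            // helper can match them against the MultiSelectList's string values.
            public int[] SelectedCategoryIds { get; set; }

            public Project Project { get; set; }
            public IEnumerable<Category> Categories { get; set; }
            // ...remaining members as in the question...
        }

    The view would then use Html.ListBoxFor(m => m.SelectedCategoryIds, new MultiSelectList(Model.Categories, "Id", "Name")), with the controller pre-populating SelectedCategoryIds to drive the initial selection.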

    Read the article

  • Run MySQL INSERT Query multiple times (insert values into multiple tables)

    - by Derek
    Hi. Basically, I have three tables: users, projects, and usersprojects, the join table that gives users and projects their many-to-many relationship. When a user adds a project, I need the project information stored, and then the userid and projectid stored in the usersprojects table. It sounds really straightforward, but I'm having problems, with the syntax I think. As it stands, I have this for my INSERT queries (values going into two different tables):

        $project_id = $_POST['project_id'];
        $projectname = $_POST['projectname'];
        $projectdeadline = $_POST['projectdeadline'];
        $projectdetails = $_POST['projectdetails'];
        $user_id = $_POST['user_id'];

        $sql = "INSERT INTO projects (projectid, projectname, projectdeadline, projectdetails)
                VALUES ('{$projectid}','{$projectname}','{$projectdeadline}','{$projectdetails}')";

        $sql = "INSERT INTO usersprojects (userid, projectid)
                VALUES ('{$userid}','{$projectid}')";

    None of the information is being stored in the projects table, but the user ID is being stored in the usersprojects table (but not the project ID!?)... I did have it working, with the project information stored correctly with a project ID, before I added this bit:

        $sql = "INSERT INTO usersprojects (userid, projectid) VALUES ('{$userid}','{$projectid}')";

    But before the code above was put in, obviously no info was being stored in the usersprojects table. The source code that links the script:

        <form id="addform" name="addform" method="POST" action="addproject-run.php">
        <label>Project Name:</label>
        <input name="projectname" size="40" id="projectname" value="<?php if (isset($_POST['projectname'])); ?>"/><br />
        <input name="user_id" input type="hidden" size="40" id="user_id" value="<?php echo $_SESSION['SESS_USERID']; ?>"/>
        <label>Project Deadline:</label>
        <input name="projectdeadline" size="40" id="projectdeadline" value="In the format of 'YYYY-MM-DD'<?php if (isset($_POST['projectdeadline'])); ?>"/><br />
        <label>Project Details:</label>
        <textarea rows="5" cols="20" name="projectdetails" id="projectdetails"><?php if (isset($_POST['projectdetails'])); ?></textarea>
        <br /><br />
        <input value="Create Project" class="addbtn" type="submit" />
        </form></div>

    So, am I right that this is the correct syntax for running insert queries against two tables? Any help is much appreciated! Thanks.
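    Two things stand out in the code as shown (my observations, not from the question itself): both query strings are assigned to the same $sql variable back to back, so only the second can ever be executed, and the script reads $_POST into $project_id/$user_id while the queries interpolate $projectid/$userid. The usual pattern is to run the first INSERT, read back the generated key (MySQL's LAST_INSERT_ID()), and use it in the second INSERT. A sketch of that pattern, written in C# for concreteness and assuming the MySQL Connector/NET library; the connection string and values are hypothetical:

        using MySql.Data.MySqlClient;

        class AddProject
        {
            static void Main()
            {
                using (var conn = new MySqlConnection(
                    "server=localhost;database=app;uid=user;pwd=secret"))
                {
                    conn.Open();

                    var insertProject = new MySqlCommand(
                        "INSERT INTO projects (projectname, projectdeadline, projectdetails) " +
                        "VALUES (@name, @deadline, @details)", conn);
                    insertProject.Parameters.AddWithValue("@name", "Demo project");
                    insertProject.Parameters.AddWithValue("@deadline", "2010-06-01");
                    insertProject.Parameters.AddWithValue("@details", "Example row");
                    insertProject.ExecuteNonQuery();

                    // The AUTO_INCREMENT key MySQL just generated (LAST_INSERT_ID()).
                    long projectId = insertProject.LastInsertedId;

                    var linkUserToProject = new MySqlCommand(
                        "INSERT INTO usersprojects (userid, projectid) VALUES (@user, @project)",
                        conn);
                    linkUserToProject.Parameters.AddWithValue("@user", 42);
                    linkUserToProject.Parameters.AddWithValue("@project", projectId);
                    linkUserToProject.ExecuteNonQuery();
                }
            }
        }

    Parameterized statements are used here deliberately; interpolating raw $_POST values into SQL, as in the question, is also an injection risk.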

    Read the article

  • SharePoint Saturday Michigan 2010 Recap, Slides, and Photos

    - by Brian Jackett
    This past weekend I attended SharePoint Saturday Michigan (SPSMI) in Ann Arbor, Michigan. For those unfamiliar, SharePoint Saturday is a community-driven event where various speakers gather to present at a FREE conference on all topics related to SharePoint. This was my third SharePoint Saturday attended and the second I've spoken at. I believe it was announced today that about 210 people total attended the event. I was very happy with the turnout, especially the ratio of male to female attendees. Typically with computer-related conferences the ratio leans towards more males attending, but Peter Serzo (one of the conference organizers) and I both commented to each other that at the end of the day it appeared to be close to 40% women in the crowd. So here's my recap of the weekend.

    Arrival
    Friday afternoon I drove up from Columbus, OH to Ann Arbor, MI and arrived around 4pm, attempting to avoid the rush-hour traffic and construction backups. That turned out to be a good idea, because other speakers coming up Friday got stuck on a highway which literally closed down in both directions due to a bad accident. I was talking my friend Sean McDonough through the highway closing, and this was the first time I had seen a solid black traffic line on Google Maps. Most of us are familiar with green, yellow, and red, but this line was black, if that tells you how bad it got.

    Speaker "Dinner"
    Fast forward a few hours and it was time for the speaker "dinner." I put "dinner" in quotes because with this night alone SPSMI set a new bar for the nicest and most extravagant speaker appreciation events for SharePoint Saturday. By tapping into some very influential contacts, the conference organizers were able to provide a truck limo (yep, you heard right) with refreshments, access to an underground suite at the Palace of Auburn Hills, and courtside tickets to see the Detroit Pistons play that night. Being a Michigan native I have to say that I was absolutely floored by this experience and very thankful to our conference organizers Peter, Sebastian, and Jesse, along with Trillium Teamologies.

    Sessions
    The actual conference started Saturday morning at 9am with the keynote by Rob Collie, who is the Microsoft program manager for PowerPivot. The day continued and I attended the following sessions:
    • Mike Watson (@mikewat) – "SharePoint 2010 Fight Night: Devs vs. Admins"
    • Karl Swedeberg (@kswedberg) – "A Walk on the Client Side with jQuery"
    • [my session] Brian Jackett (@briantjackett) – "Real World Deployment of SharePoint 2007 Solutions"
    • Jeff Willinger (@jwillie) – "Social Computing and Collaboration Inside and Outside the 4 Walls"
    • Paul Schaeflein (@paulschaeflein) – "PowerShell for the SharePoint Developer"

    My Presentation
    I had a great time presenting my session on deploying SharePoint 2007 solutions, but it wasn't without its fair share of technical issues. As my session was right after lunch I came into my room 10 minutes early to set up my laptop, slides, and demos. As a quick background note, a few months ago I got an upgraded laptop from my company Sogeti and have been dual-booting it between XP (factory installed) and Windows Server 2008 R2 with Hyper-V. As such I had prepared all of my demo virtual machines to run under Hyper-V. About 3 minutes before my session was scheduled to start, though, it became apparent that I did not have the correct display drivers to connect Windows Server 2008 R2 to the projector…

    As you can imagine this was a slight cause for concern, as I was potentially going to be unable to give my presentation. Luckily for me I usually prepare for such unforeseen issues and had my presentation and some spare VMs that would run on XP on my external hard drive. Knowing this, I rebooted my machine into XP and began my presentation without slides until about 5 minutes into the session, when everything was up and running on XP. Despite this being the first time I gave this presentation, I have to say it was one of my favorites I've given so far. The audience was very engaged in the session and I received some great, positive feedback afterwards. Thanks to all who attended my session, I appreciate it very much.

    Link to Presentation Files
    For those of you who attended my session and would like my slides or demo PowerShell scripts, they can be found on my SkyDrive at the links below. Also, if you have a few minutes and wouldn't mind rating my session, I have this session posted on SpeakerRate. As speakers we always appreciate any and all feedback attendees offer, so thank you if you are able to provide any.
    • SkyDrive folder with session files
    • Rate my SharePoint 2007 Solutions session

    Picture Albums
    For everyone else, here are my pictures from the weekend. The first link is to my Facebook album, which will have tagging (I recommend this one). The second is to my Live album if you care for higher-resolution images.
    • http://www.facebook.com/album.php?aid=2154482&id=21905041&l=a3fb72ee8c
    • View Full Album

    Conclusion
    A big thank you goes out to all of the organizers, speakers, sponsors, and attendees of SPSMI. As I've said so many times, without each and every one of you these events wouldn't be possible. I thoroughly enjoyed this trip back to my home state and presenting a new session. For those interested in my upcoming schedule, I will be giving two sessions on PowerShell at SharePoint Saturday Charlotte in April, helping plan Stir Trek: Iron Man Edition in May, and I'm submitting sessions to Day of .Net Ann Arbor in May as well. Beyond that I haven't planned out any travels. Thanks for reading my recap. Look forward to more technical posts now that I have a short break in conferences.

    -Frog Out

    Read the article

  • Is gchart safe to use?

    - by Paul Tomblin
    The home page for gchart, a client-side charting add-in for the Google Web Toolkit (GWT), has a long screed about how the project's only maintainer thinks his Google account has been hacked, and that because of this he will be "disavowing/abandoning my own project and Google account". Does that mean the project is an orphan? Is somebody taking it over? There is always a risk in basing your project on somebody else's code, because they may stop supporting it or abandon it during your project's lifetime, but it seems to me that with the fast evolution of Java and GWT, using gchart in a new project may be a big mistake. Am I right?

    Read the article

  • Do You Know How OUM defines the four basic types of business system testing performed on a project? Why not test your knowledge?

    - by user713452
    Testing is perhaps the most important process in the Oracle® Unified Method (OUM). That makes it all the more important for practitioners to have a common understanding of the various types of functional testing referenced in the method, and to use the proper terminology when communicating with each other about testing activities. OUM identifies four basic types of functional testing, which is sometimes referred to as business system testing. The basic functional testing types referenced by OUM are:

    1. Unit Testing
    2. Integration Testing
    3. System Testing
    4. Systems Integration Testing

    See if you can match each of the following definitions with the appropriate type above.

    A. This type of functional testing is focused on verifying that interfaces/integration between the system being implemented (i.e. the System under Discussion (SuD)) and external systems function as expected.

    B. This type of functional testing is performed for custom software components only, is typically performed by the developer of the custom software, and is focused on verifying that the several custom components developed to satisfy a given requirement (e.g. screen, program, report, etc.) interact with one another as designed.

    C. This type of functional testing is focused on verifying that the functionality within the system being implemented (i.e. the System under Discussion (SuD)) functions as expected. This includes out-of-the-box functionality delivered with Commercial Off-The-Shelf (COTS) applications, as well as any custom components developed to address gaps in functionality.

    D. This type of functional testing is performed for custom software components only, is typically performed by the developer of the custom software, and is focused on verifying that the individual custom components developed to satisfy a given requirement (e.g. screen, program, report, etc.) function as designed.

    Check your answers: 1-D, 2-B, 3-C, 4-A. If you matched all of the functional testing types to their definitions correctly, then congratulations! If not, you can find more information in the Testing Process Overview and Testing Task Overviews in the OUM Method Pack.

    Read the article

  • Generate Ant build file

    - by inakiabt
    I have the following project structure:

        root/
          comp/
            env/
              version/
                build.xml
              build.xml
            build.xml

    where root/comp/env/version/build.xml is:

        <project name="comp-env-version" basedir=".">
          <import file="../build.xml" optional="true" />
          <echo>Comp Env Version tasks</echo>
          <target name="run">
            <echo>Comp Env Version run task</echo>
          </target>
        </project>

    root/comp/env/build.xml is:

        <project name="comp-env" basedir=".">
          <import file="../build.xml" optional="true" />
          <echo>Comp Env tasks</echo>
          <target name="run">
            <echo>Comp Env run task</echo>
          </target>
        </project>

    and root/comp/build.xml is:

        <project name="comp" basedir=".">
          <echo>Comp tasks</echo>
        </project>

    Each build file imports its parent's build file, and each child inherits and overrides the parent's tasks/properties. What I need is to get the generated (effective) build XML without running anything. For example, if I ran "ant" (or something like that) in root/comp/env/version/, I would like to get the following output:

        <project name="comp-env-version" basedir=".">
          <echo>Comp tasks</echo>
          <echo>Comp Env tasks</echo>
          <echo>Comp Env Version tasks</echo>
          <target name="run">
            <echo>Comp Env Version run task</echo>
          </target>
        </project>

    Is there an Ant plugin to do this? With Maven? What are my options if not? EDIT: I need something like "mvn help:effective-pom", but for Ant.

    Read the article

  • Benefits of PerformancePoint Services Using SharePoint Server 2010

    - by Wayne
    What is PerformancePoint Services?
    Most of the time, the metrics that make up your key performance indicators are not simple values from a data source. With PerformancePoint Services in the SharePoint Server 2007 era, you could create two kinds of KPI metrics: simple single-value metrics from any supported data source, or complex multiple-value metrics from a single Analysis Services data source using MDX. Now things are even easier with PerformancePoint Services in SharePoint 2010. So what is it?

    PerformancePoint Services in SharePoint Server 2010 is a performance management service that you can use to monitor and analyze your business. By providing flexible, easy-to-use tools for building dashboards, scorecards, reports, and key performance indicators (KPIs), PerformancePoint Services can help everyone across an organization make informed business decisions that align with companywide objectives and strategy. Scorecards, dashboards, and KPIs help drive accountability. Integrated analytics help employees move quickly from monitoring information to analyzing it and, when appropriate, sharing it throughout the organization.

    Prior to the addition of PerformancePoint Services to SharePoint Server, Microsoft Office PerformancePoint Server 2007 functioned as a standalone server. Now PerformancePoint functionality is available as an integrated part of the SharePoint Server Enterprise license, as is the case with Excel Services in Microsoft SharePoint Server 2010. The popular features of earlier versions of PerformancePoint Services are preserved, along with numerous enhancements and additional functionality.

    New PerformancePoint Services features
    PerformancePoint Services can now utilize SharePoint Server scalability, collaboration, backup and recovery, and disaster recovery capabilities. Dashboards and dashboard items are stored and secured within SharePoint lists and libraries, providing you with a single security and repository framework.

    New features and enhancements of SharePoint 2010 PerformancePoint Services:
    • With PerformancePoint Services functioning as a service in SharePoint Server, dashboards and dashboard items are stored and secured within SharePoint lists and libraries, providing you with a single security and repository framework. The new architecture also takes advantage of SharePoint Server scalability, collaboration, backup and recovery, and disaster recovery capabilities. You can also include and link PerformancePoint Services Web Parts with other SharePoint Server Web Parts on the same page. The new architecture also streamlines security models, simplifying access to report data.
    • The Decomposition Tree is a new visualization report type available in PerformancePoint Services. You can use it to quickly and visually break down higher-level data values from a multi-dimensional data set to understand the driving forces behind those values. The Decomposition Tree is available in scorecards and analytic reports, and ultimately in dashboards.
    • You can access more detailed business information with improved scorecards. Scorecards have been enhanced to make it easy for you to drill down and quickly access more detailed information. PerformancePoint scorecards also offer more flexible layout options, dynamic hierarchies, and calculated-KPI features. Using this enhanced functionality, you can now create custom metrics that use multiple data sources. You can also sort, filter, and view variances between actual and target values to help you identify concerns or risks.
    • Better Time Intelligence filtering capabilities that you can use to create dynamic time filters that are always up to date. Other improved filters enhance the ability of dashboard users to quickly focus in on the information that is most relevant.
    • The ability to include and link PerformancePoint Services Web Parts together with other PerformancePoint Services Web Parts on the same page.
    • Easier authoring and publishing of dashboard items using Dashboard Designer.
    • SQL Server Analysis Services 2008 support.
    • Increased support for accessibility compliance in individual reports and scorecards.
    • The KPI Details report is a new report type that displays contextually relevant information about KPIs, metrics, rows, columns, and cells within a scorecard. The KPI Details report works as a Web Part that links to a scorecard or individual KPI to show relevant metadata to the end user in SharePoint Server. This Web Part can be added to PerformancePoint dashboards or any SharePoint Server page.
    • Analytic reports to better understand the underlying business forces behind the results. Analytic reports have been enhanced to support value filtering, new chart types, and server-based conditional formatting.

    To conclude, PerformancePoint Services, by becoming tightly integrated with SharePoint Server 2010, takes advantage of many enterprise-level SharePoint Server 2010 features. Unfortunately, SharePoint Foundation 2010 doesn't include this feature. There are still many choices in the SharePoint family of products, including SharePoint Server 2010, SharePoint Foundation, SharePoint Server 2007, and associated free SharePoint Web Parts and templates.

    Read the article

  • FlexBuilder compiler bug - IWatcherSetupUtil2 et al

    - by Marty Pitt
    I'm having a problem with Flash Builder in what is clearly a compiler bug, but I can't track it down. When my project is compiled inside Flash Builder, I get the following compiler errors:

        Type was not found or was not a compile-time constant: [mx.binding]::IBindingClient
        Type was not found or was not a compile-time constant: [mx.binding]::IWatcherSetupUtil2
        Type was not found or was not a compile-time constant: [mx.core]::IStateClient2

    These errors are reported without a path or location. My project is a Flex 4 project, moderately complex. It has 6 SWC projects, which are referenced within a SWF project (the SWF project is the one that reports the errors). The Ant build script compiles the project fine. The problem exists on more than one PC. How do I start tracking down what's causing the problem?

    Read the article

  • Integrate Flex 3.5 projects in Flash Builder 4 beta 2

    - by Cyrill Zadra
    Hi. I'm currently using Flex Builder 3 and Flex SDK 3.5 for my projects, but I'd like to try out the new Flash Builder 4. So I downloaded and installed the new software, configured all the additional software (Subversion, server adapter, ...), and finally imported my two projects:

    1) Main project (includes an SWC generated by the library project) (Flex SDK 3.5)
    2) Library project (Flex SDK 3.4)

    After the import and a project clean-up, the project ran perfectly. But as soon as I replace the existing LibraryProject.swc with a new one (compiled with Flash Builder 4 beta 2, SDK 3.4), I get:

        VerifyError: Error #1014: class mx.containers::Canvas not found.
        VerifyError: Error #1014: class mx.containers::HBox not found.
        VerifyError: Error #1014: class IWatcherSetupUtil not found.

    ...and several other "not found" errors. Has anyone had the same error? How can I get my project running again? thanks & regards cyrill

    Read the article

  • Responsible BI for Excel, Even for Older Versions

    - by andrewbrust
    On Wednesday, I will have the honor of co-presenting, for both The Data Warehouse Institute (TDWI) and the New York Technology Council, on the subject of Excel and BI. My co-presenter will be none other than Bill Baker, who was a Microsoft Distinguished Engineer and, essentially, the father of BI at that company. Details on the events are here and here. We'll be talking about PowerPivot, of course, but that's not all. Probably even more important than any one product will be our discussion of whether the usual characterization of Excel as the nemesis of IT, the guilty pleasure of business users, and the antithesis of formal BI is really valid and/or hopelessly intractable. Without giving away our punchline, I'll tell you that we are much more optimistic than that. There are huge upsides to Excel, and while there are real dangers to using it in the BI space, there are standards and practices you can employ to ensure Excel is used responsibly. And when those practices are followed, Excel becomes quite powerful indeed. One of the keys to this is using Excel as a data consumer rather than a data storage mechanism. Caching data in Excel is OK, but only if that data is (a) not modified and (b) configured for automated periodic refresh. PowerPivot meets both criteria -- it stores a read-only copy of your data in the form of a model, and once a workbook containing a PowerPivot model is published to SharePoint, it can be configured for scheduled data refresh, on the server, requiring no user intervention whatsoever. Data refresh is a bit like hard drive backup: it will only happen reliably if it's automated and super-easy to configure. PowerPivot hits a real home run here (as does Windows Home Server for PC backup, but I digress). The thing about PowerPivot is that it's an add-in for Excel 2010. What if you're not planning to go to that new version for quite a while? What if you've just deployed Office 2007 in your organization? What if you're still on Office 2003, or an even earlier version? What can you do immediately to share data responsibly and easily? As it turns out, there's a feature in Excel that's been around for quite a while that can help: Web Queries. The Web Query feature was introduced, ostensibly, to allow Excel to pull data in from Internet Web pages… for example, data in a stock quote history table will come in nicely, as will any data in a Web page that is displayed in an HTML table. To use the feature in Excel 2007 or 2010, click the Data tab of the ribbon and click the "From Web" button towards the left; in older versions, use the corresponding option in the menu or toolbars. Next, paste a URL into the resulting dialog box and tap Enter or click the Go button. A preview of the Web page will come up, and the dialog will allow you to select the specific table within the page whose data you'd like to import. Here's an example: Now just click the table, click the Import button, and the Import Data dialog appears. You can simply click OK to bring in your data, or you can first click the Properties… button and configure the data import to be refreshed at an interval in minutes that you select. Now your data's in the spreadsheet and ready to be worked with. Your data may be vulnerable to modification, but if you've set up the data refresh, any accidental or malicious changes will be corrected in time anyway. The thing about this feature is that it's most useful not for public Web pages, but for pages behind the firewall.
In effect, the Web Query feature provides an incredibly easy way to consume data in Excel that’s “published” from an application.  Users just need a URL.  They don’t need to know server and database names and since the data is read-only, providing credentials may be unnecessary, or can be handled using integrated security.  If that’s not good enough, the Web Query can be saved to a special .iqy file, which can be edited to provide POST parameter data. The only requirement is that the data must be provided in an HTML table, with the first row providing the column names.  From an ASP.NET project, it couldn’t be easier: a simple bound GridView control is totally compatible.  Use a data source control with it, and you don’t even have to write any code.  Users can link to pages that are part of an application’s UI, or developers can create pages that are specially designed for the purpose of providing an interface to the Web Query import feature.  And none of this is Microsoft- or .NET-specific.  You can create pages in any language you want (PHP comes to mind) that output the result set of a query in HTML table format, and then consume that data in a Web Query.  Then build PivotTables and charts on the data, and in Excel 2007 or 2010 you can use conditional formatting to create scorecards and dashboards. This strategy allows you to create pages that function quite similarly to the OData XML feeds rendered when .NET developers create an “Astoria” WCF Data Service.  And while it’s cool that PowerPivot and Excel 2010 can import such OData feeds, it’s good to know that older versions of Excel can function in a similar fashion, and can consume data produced by virtually any Web development platform. As a final matter, instead of just telling you that “older versions” of Excel support this feature, I’ll be more specific.  To discover what the first version of Excel was to support Web queries, go to http://bit.ly/OldSchoolXL.
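    Returning to the "publish an HTML table" idea above, here is a minimal sketch (my own illustration, not from the article) of an ASP.NET HTTP handler that emits a result set as a plain HTML table with a header row, which is exactly the shape the Web Query feature consumes; the handler name, columns, and hard-coded rows are hypothetical stand-ins for a real database query:

        using System.Web;

        // Map to a URL such as SalesTable.ashx; users then point an Excel
        // Web Query at that URL and pick the table from the preview.
        public class SalesTableHandler : IHttpHandler
        {
            public void ProcessRequest(HttpContext context)
            {
                context.Response.ContentType = "text/html";
                var w = context.Response.Output;
                w.Write("<html><body><table>");
                // First row carries the column names, as the feature expects.
                w.Write("<tr><th>Region</th><th>Sales</th></tr>");
                // In a real page these rows would come from a query or a
                // bound GridView, as described above.
                w.Write("<tr><td>East</td><td>1200</td></tr>");
                w.Write("<tr><td>West</td><td>950</td></tr>");
                w.Write("</table></body></html>");
            }

            public bool IsReusable
            {
                get { return true; }
            }
        }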

    Read the article

  • Making a new open-source web OS?

    - by Ayman
    Hello guys. I'm trying to make a new web OS for my graduation project; I'm a computer science student. Two days ago I sat down with the Ghost OS R&D operations manager, and he told me it's a big project and that I shouldn't be thinking about a project like this. Anyway, I have experience, and I can build it using GWT and some server-side programming languages. I'm going to make it a development environment: an OS with a specific API that allows anybody to write applications, plus some modification points for adding third-party apps. I need some feedback, and what about making it an open-source project? What do you think, guys?

    Read the article

  • ClearCase: Versioning at file level

    - by Naveen
    Hi. I have a peculiar problem about how to maintain a ClearCase project. The project is an XML schema repository where each schema has a version. This repository is common and is used by all the apps in the enterprise. From a ClearCase perspective the project has a single component. Now, the apps can be using different versions of the schema(s), so we are trying to figure out a way to set up the project such that a project can have a baseline of which versions of these files are included in a build. The only way we know how to do this is to create a component for each schema (or group of schemas) and create a stream for each app to include the components it uses, but that would result in too many components. Has anyone dealt with something like this before? We are prepared to restructure the whole project if necessary, so we are open to any ideas. Thanks for the help.

    Read the article

  • Maven best practice for generating multiple jars with different/filtered classes ?

    - by jaguard
    I developed a Java utility library (similar to Apache Commons) that I use in various projects. In addition to fat clients, I also use it for mobile clients (PDA with the J9 Foundation profile). Over time the library, which started as a single project, spread over multiple packages. As a result I end up with a lot of functionality that is not really needed in every project. Since this library is also used inside some mobile/PDA projects, I need a way to collect just the used classes and generate the actual specialized jars.

    Currently, in the projects that are using this library, I have Ant jar tasks that generate (from the utility project) the specialized jar files (e.g. my-util-1.0-pda.jar, my-util-1.0-rcp.jar) using the jar task's include/exclude features. This is mostly needed due to the jar size constraints of the mobile projects.

    Migrating now to Maven, I wonder if there are any best practices for arriving at something similar. I am considering the following scenarios:

    [1] Additionally to the main jar artifact (my-lib-1.0.jar), also generate inside the my-lib project the separate/specialized artifacts using classifiers (e.g. my-lib-1.0-pda.jar), using the Maven Jar Plugin or Maven Assembly Plugin filtering/includes. I'm not very comfortable with this approach since it pollutes the library with the library consumers' demands (filters).

    [2] Create additional Maven projects for all the specialized clients/projects that will "wrap" my-lib and generate the filtered jar artifacts (e.g. my-lib-wrapper-pda-1.0, etc.). As a result, these wrapper projects will contain the filtering (to generate the filtered artifact) and will depend just on the my-lib project, and the client projects will depend on my-lib-wrapper-xxx-1.0 instead of my-lib-1.0. This approach may look problematic since, even though it leaves the my-lib project intact (with no additional classifiers and artifacts), it basically doubles the number of projects, since for every client project I'll have one just to collect the needed classes from the my-util library (a "my-pda-app" project will need a "my-lib-wrapper-for-my-pda-app" project/dependency).

    [3] In every client project that uses the library (e.g. my-pda-app), add some specialized Maven plugins to trim out (when generating the final artifact/package) the un-needed classes (e.g. maven-assembly-plugin, maven-jar-plugin, proguard-maven-plugin).

    What is the best practice for solving this kind of problem in the "Maven way"?

    Read the article

  • Borland StarTeam incorrect file status

    - by Kate
    I have a project stored in StarTeam. As there are a lot of obsolete files, I can't check in or check out the whole project, only changed files. Now I am copying the project from one computer to another for another developer. I expected StarTeam to treat the copied project as new items for check-in and check-out, but it doesn't. For example: I modify a file on the first computer. I update the list of files on the second computer and see this file in the "check in" list, as if I had modified it on the second computer. That is incorrect. I think there is some configuration file (or something like it) that saves computer (or user) settings, so when the project is copied, the settings are copied too. Does anybody know how to change this configuration so that the copied project is treated as a new instance by StarTeam?

    Read the article

  • NAnt xmlpoke and unique nodes

    - by Lou Franco
    I am trying to use an xmlpoke task to update a Visual Studio project file (which is XML). In the project root there are multiple PropertyGroup nodes; I am trying to select the first one. The XML looks like this:

        <Project>
          <PropertyGroup>
          </PropertyGroup>
          <PropertyGroup>
          </PropertyGroup>
          <PropertyGroup>
          </PropertyGroup>
        </Project>

    I am using an XPath of //Project/PropertyGroup[1] to get the first PropertyGroup, but I get the error: "Non-unique xpath given //Project/PropertyGroup[1]".

    Read the article

  • Compiling OpenCV 2.4 on a 64-bit Mac in Xcode

    - by Walt
    I have an OpenCV project that I've been developing under Ubuntu 12.04, in a Parallels VM on a Mac (which has an x86_64 architecture). There have been many screen-switching performance issues that I believe are due to the VM, where the Linux video mode flips around for a couple of seconds while the camera is accessed by the OpenCV application. I decided to move the project into Xcode on the Mac side of the computer to continue the OpenCV development. However, I'm not that familiar with Xcode and am having trouble getting the project to build correctly there.

    I have Xcode installed. I downloaded and decompressed the latest version of OpenCV on the Mac, and from ~/src/opencv/build/ ran:

        cmake-gui -G Xcode ..

    per the instructions from Willow Garage and various other locations. This appeared to work fine (however, I'm wondering now if I'm missing an architecture setting in here, although it is 64-bit Intel in Xcode). I then set up an Xcode project with the source files from the Linux project and changed the include directories to use /opt/local/include/... rather than /usr/local/include/... I switched Xcode to use the LLVM GCC compiler in the build settings for the project, then set the Apple LLVM "C++ Language Dialect" to GNU++11 (which seems possibly inconsistent with the line above). I'm not using a makefile in Xcode (that I'm aware of; it has its own project file...).

    I was also running into a linker issue that looked like it might be resolved with the addition of the linker flag -lopencv_video, based on a similar posting here: [other thread]. However, in that case the person was using a Makefile in their project. I've tried adding this linker flag under "Other Linker Flags" in the Xcode build settings but still get build errors.

    I think I may have two issues here: one with the architecture settings when building the OpenCV libraries with CMake, and one with the linker flag settings in my project. Currently the build error list looks like this:

        Undefined symbols for architecture x86_64:
          "cv::_InputArray::_InputArray(cv::Mat const&)", referenced from: _main in main.o
          "cv::_OutputArray::_OutputArray(cv::Mat&)", referenced from: _main in main.o
          "cv::Mat::deallocate()", referenced from: cv::Mat::release() in main.o
          "cv::Mat::copySize(cv::Mat const&)", referenced from: cv::Mat::Mat(cv::Mat const&) in main.o cv::Mat::operator=(cv::Mat const&) in main.o
          "cv::Mat::Mat(_IplImage const*, bool)", referenced from: _main in main.o
          "cv::imread(std::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, int)", referenced from: _main in main.o
          ---SNIP---
        ld: symbol(s) not found for architecture x86_64
        collect2: ld returned 1 exit status

    Can anyone provide some guidance on what to try next? Thanks, Walt

    Read the article

  • How to troubleshoot dependencies not copying.

    - by AngryHacker
    I have an EXE project A, which references a class library project B (all in the same solution). Project B has references to about 10 third-party DLLs (from DevExpress). All the referenced assemblies have Copy Local set to true. When I build the entire solution, the resulting DLL from project B is copied into the bin/debug folder of project A; however, none of the dependencies of project B get copied. I looked in the Output window during the compile and all looks good. There aren't any errors. How do I troubleshoot this?

    Read the article

  • How to write a Unicode hello world in C on Windows

    - by hatchetman82
    I'm trying to get this to work:

        #define UNICODE
        #define _UNICODE
        #include <wchar.h>

        int main()
        {
            wprintf(L"Hello World!\n");
            wprintf(L"£?, ?, ?!\n");
            return 0;
        }

    using Visual Studio 2008 Express (on Windows XP, if it matters). When I run this from the command prompt (started as cmd /u, which is supposed to enable Unicode?) I get this:

        C:\dev\unicodevs\unicodevs\Debug>unicodevs.exe
        Hello World!
        -ú8
        C:\dev\unicodevs\unicodevs\Debug>

    which I suppose was to be expected, given that the terminal does not have the font to render those. But what gets me is that even if I try this:

        C:\dev\unicodevs\unicodevs\Debug>cmd /u /c "unicodevs.exe > output.txt"

    the file produced (even though it is UTF-8 encoded) looks like:

        Hello World!
        £ì

    The source file itself is defined as Unicode (encoded in UTF-8 without BOM). The compiler output when building:

        1>------ Rebuild All started: Project: unicodevs, Configuration: Debug Win32 ------
        1>Deleting intermediate and output files for project 'unicodevs', configuration 'Debug|Win32'
        1>Compiling...
        1>main.c
        1>.\main.c(1) : warning C4005: 'UNICODE' : macro redefinition
        1>        command-line arguments : see previous definition of 'UNICODE'
        1>.\main.c(2) : warning C4005: '_UNICODE' : macro redefinition
        1>        command-line arguments : see previous definition of '_UNICODE'
        1>Note: including file: C:\Program Files\Microsoft Visual Studio 9.0\VC\include\wchar.h
        1>Note: including file:  C:\Program Files\Microsoft Visual Studio 9.0\VC\include\crtdefs.h
        1>Note: including file:   C:\Program Files\Microsoft Visual Studio 9.0\VC\include\sal.h
        1>C:\Program Files\Microsoft Visual Studio 9.0\VC\include\sal.h(108) : warning C4001: nonstandard extension 'single line comment' was used
        1>Note: including file:   C:\Program Files\Microsoft Visual Studio 9.0\VC\include\crtassem.h
        1>Note: including file:  C:\Program Files\Microsoft Visual Studio 9.0\VC\include\vadefs.h
        1>Note: including file:  C:\Program Files\Microsoft Visual Studio 9.0\VC\include\swprintf.inl
        1>Note: including file:  C:\Program Files\Microsoft Visual Studio 9.0\VC\include\wtime.inl
        1>Linking...
        1>Embedding manifest...
        1>Creating browse information file...
        1>Microsoft Browse Information Maintenance Utility Version 9.00.30729
        1>Copyright (C) Microsoft Corporation. All rights reserved.
        1>Build log was saved at "file://c:\dev\unicodevs\unicodevs\unicodevs\Debug\BuildLog.htm"
        1>unicodevs - 0 error(s), 3 warning(s)
        ========== Rebuild All: 1 succeeded, 0 failed, 0 skipped ==========

    Any ideas on what I am doing wrong? Similar questions on SO (like this one: http://stackoverflow.com/questions/787589/unicode-hello-world-for-c) seem to refer to *nix builds - as far as I understand, setlocale() is not available for Windows. I also tried building this using Code::Blocks/MinGW GCC, but got the same results.

    Read the article

  • NHibernate exception on query

    - by Yoav
    I'm getting a mapping exception doing the most basic query. This is my domain class:

        public class Project
        {
            public virtual string PK { get; set; }
            public virtual string Id { get; set; }
            public virtual string Name { get; set; }
            public virtual string Description { get; set; }
        }

    And the mapping class:

        public class ProjectMap : ClassMap<Project>
        {
            public ProjectMap()
            {
                Table("PROJECTS");
                Id(x => x.PK, "PK");
                Map(x => x.Id, "ID");
                Map(x => x.Name, "NAME");
                Map(x => x.Description, "DESCRIPTION");
            }
        }

    Configuration:

        public ISessionFactory SessionFactory
        {
            return Fluently.Configure()
                .Database(MsSqlCeConfiguration.Standard.ShowSql()
                    .ConnectionString(c => c.Is("data source=" + path)))
                .Mappings(m => m.FluentMappings.AddFromAssemblyOf<Project>())
                .BuildSessionFactory();
        }

    And the query:

        IList project;
        using (ISession session = SessionFactory.OpenSession())
        {
            IQuery query = session.CreateQuery("from Project");
            project = query.List<Project>();
        }

    I'm getting the exception on the query line:

        NHibernate.Hql.Ast.ANTLR.QuerySyntaxException: Project is not mapped [from Project]
          at NHibernate.Hql.Ast.ANTLR.SessionFactoryHelperExtensions.RequireClassPersister(String name)
          at NHibernate.Hql.Ast.ANTLR.Tree.FromElementFactory.AddFromElement()
          at NHibernate.Hql.Ast.ANTLR.Tree.FromClause.AddFromElement(String path, IASTNode alias)
          at NHibernate.Hql.Ast.ANTLR.HqlSqlWalker.fromElement()
          at NHibernate.Hql.Ast.ANTLR.HqlSqlWalker.fromElementList()
          at NHibernate.Hql.Ast.ANTLR.HqlSqlWalker.fromClause()
          at NHibernate.Hql.Ast.ANTLR.HqlSqlWalker.unionedQuery()
          at NHibernate.Hql.Ast.ANTLR.HqlSqlWalker.query()
          at NHibernate.Hql.Ast.ANTLR.HqlSqlWalker.selectStatement()
          at NHibernate.Hql.Ast.ANTLR.HqlSqlWalker.statement()
          at NHibernate.Hql.Ast.ANTLR.HqlSqlTranslator.Translate()
          at NHibernate.Hql.Ast.ANTLR.QueryTranslatorImpl.Analyze(HqlParseEngine parser, String collectionRole)
          at NHibernate.Hql.Ast.ANTLR.QueryTranslatorImpl.DoCompile(IDictionary`2 replacements, Boolean shallow, String collectionRole)
          at NHibernate.Hql.Ast.ANTLR.QueryTranslatorImpl.Compile(IDictionary`2 replacements, Boolean shallow)
          at NHibernate.Engine.Query.HQLQueryPlan..ctor(String hql, String collectionRole, Boolean shallow, IDictionary`2 enabledFilters, ISessionFactoryImplementor factory)
          at NHibernate.Engine.Query.QueryPlanCache.GetHQLQueryPlan(String queryString, Boolean shallow, IDictionary`2 enabledFilters)
          at NHibernate.Impl.AbstractSessionImpl.GetHQLQueryPlan(String query, Boolean shallow)
          at NHibernate.Impl.AbstractSessionImpl.CreateQuery(String queryString)

    I assume something is wrong with my query.

    Read the article
