Search Results

Search found 17616 results on 705 pages for 'report tools'.


  • DB object passing between classes: singleton, static, or other?

    - by Stephen
    So I'm designing a reporting system at work. It's my first project written in OOP, and I'm stuck on the design choice for the DB class. Obviously I only want to create one instance of the DB class per session/user and then pass it to each of the classes that need it. What I don't know is what's best practice for implementing this. Currently I have code like the following:

        class db {
            private $user = 'USER';
            private $pass = 'PASS';
            private $tables = array('user', 'report', 'etc...');

            function __construct() {
                // SET UP CONNECTION AND TABLES
            }
        }

        class report {
            // $params is optional, so it comes after the required arguments
            function __construct($db, $user, $params = array()) {
                // Error checking/handling trimmed
                // $db is the database object we created
                $this->db = $db;
                // $this->user is the user object for the logged-in user
                $this->user = $user;
                $this->reportCreate();
            }

            public function setPermission($permissionId = 1) {
                // Note the $this->db - is this the best-practice solution?
                $this->db->permission->find($permissionId);
                // Note the $this->user - is this the best-practice solution?
                $this->user->checkPermission(1);
                $data = array();
                $this->db->reportpermission->insert($data);
            }
        } // end report

    I've been reading about using static classes and have just come across singletons (though these appear to be passé already?), so what's the current best practice for doing this?

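    A common answer to this question today is plain constructor injection rather than a singleton or a static class: build the connection once at the application's entry point and hand it to whatever needs it, which is essentially what the code above already does. Below is a minimal sketch of the idea; it is illustrative only, written in C# because the pattern is language-agnostic, and every name in it is hypothetical.

        // Sketch: one shared DB handle, injected where needed.
        public sealed class Db
        {
            private readonly string connectionString;

            public Db(string connectionString)
            {
                this.connectionString = connectionString; // set up the connection here
            }
        }

        public sealed class User { /* the logged-in user */ }

        public sealed class Report
        {
            private readonly Db db;
            private readonly User user;

            // Dependencies are passed in, never looked up via a global.
            public Report(Db db, User user)
            {
                this.db = db;
                this.user = user;
            }
        }

        // Composition root: one Db per session, shared by injection.
        // var db = new Db("...connection string...");
        // var report = new Report(db, currentUser);

    Compared with a singleton, this keeps the classes testable (a fake Db can be injected) and makes the dependency explicit in the constructor signature.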

  • Pig_Cassandra integration causes ERROR 1070: Could not resolve CassandraStorage using imports

    - by Le Dude
    I'm following a basic Pig, Cassandra, and Hadoop installation. Everything works just fine standalone; no errors. However, when I try to run the example file provided with the Pig_cassandra example, I get this error:

        [root@localhost pig]# /opt/cassandra/apache-cassandra-1.1.6/examples/pig/bin/pig_cassandra -x local /opt/cassandra/apache-cassandra-1.1.6/examples/pig/example-script.pig
        Using /opt/pig/pig-0.10.0/pig-0.10.0-withouthadoop.jar.
        2012-10-24 21:14:58,551 [main] INFO org.apache.pig.Main - Apache Pig version 0.10.0 (r1328203) compiled Apr 19 2012, 22:54:12
        2012-10-24 21:14:58,552 [main] INFO org.apache.pig.Main - Logging error messages to: /opt/pig/pig_1351138498539.log
        2012-10-24 21:14:59,004 [main] INFO org.apache.pig.backend.hadoop.executionengine.HExecutionEngine - Connecting to hadoop file system at: file:///
        2012-10-24 21:14:59,472 [main] ERROR org.apache.pig.tools.grunt.Grunt - ERROR 1070: Could not resolve CassandraStorage using imports: [, org.apache.pig.builtin., org.apache.pig.impl.builtin.]
        Details at logfile: /opt/pig/pig_1351138498539.log

    Here is the log file:

        Pig Stack Trace
        ---------------
        ERROR 1070: Could not resolve CassandraStorage using imports: [, org.apache.pig.builtin., org.apache.pig.impl.builtin.]

        org.apache.pig.impl.logicalLayer.FrontendException: ERROR 1000: Error during parsing. Could not resolve CassandraStorage using imports: [, org.apache.pig.builtin., org.apache.pig.impl.builtin.]
            at org.apache.pig.PigServer$Graph.parseQuery(PigServer.java:1597)
            at org.apache.pig.PigServer$Graph.registerQuery(PigServer.java:1540)
            at org.apache.pig.PigServer.registerQuery(PigServer.java:540)
            at org.apache.pig.tools.grunt.GruntParser.processPig(GruntParser.java:970)
            at org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:386)
            at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:189)
            at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:165)
            at org.apache.pig.tools.grunt.Grunt.exec(Grunt.java:84)
            at org.apache.pig.Main.run(Main.java:555)
            at org.apache.pig.Main.main(Main.java:111)
            at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
            at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
            at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
            at java.lang.reflect.Method.invoke(Method.java:601)
            at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
        Caused by: Failed to parse: Cannot instantiate: CassandraStorage
            at org.apache.pig.parser.QueryParserDriver.parse(QueryParserDriver.java:184)
            at org.apache.pig.PigServer$Graph.parseQuery(PigServer.java:1589)
            ... 14 more
        Caused by: java.lang.RuntimeException: Cannot instantiate: CassandraStorage
            at org.apache.pig.impl.PigContext.instantiateFuncFromSpec(PigContext.java:510)
            at org.apache.pig.parser.LogicalPlanBuilder.validateFuncSpec(LogicalPlanBuilder.java:791)
            at org.apache.pig.parser.LogicalPlanBuilder.buildFuncSpec(LogicalPlanBuilder.java:780)
            at org.apache.pig.parser.LogicalPlanGenerator.func_clause(LogicalPlanGenerator.java:4583)
            at org.apache.pig.parser.LogicalPlanGenerator.load_clause(LogicalPlanGenerator.java:3115)
            at org.apache.pig.parser.LogicalPlanGenerator.op_clause(LogicalPlanGenerator.java:1291)
            at org.apache.pig.parser.LogicalPlanGenerator.general_statement(LogicalPlanGenerator.java:789)
            at org.apache.pig.parser.LogicalPlanGenerator.statement(LogicalPlanGenerator.java:507)
            at org.apache.pig.parser.LogicalPlanGenerator.query(LogicalPlanGenerator.java:382)
            at org.apache.pig.parser.QueryParserDriver.parse(QueryParserDriver.java:175)
            ... 15 more
        Caused by: org.apache.pig.backend.executionengine.ExecException: ERROR 1070: Could not resolve CassandraStorage using imports: [, org.apache.pig.builtin., org.apache.pig.impl.builtin.]
            at org.apache.pig.impl.PigContext.resolveClassName(PigContext.java:495)
            at org.apache.pig.impl.PigContext.instantiateFuncFromSpec(PigContext.java:507)
            ... 24 more
        ================================================================================

    I googled around and got to this point from another Stack Overflow user who identified the potential problem but not the solution: "Cassandra and pig integration cause error during startup". I believe my configuration is correct and the path has already been defined properly. I didn't change anything in the pig_cassandra file. I'm not quite sure how to proceed from here. Please help?

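    For what it's worth, ERROR 1070 simply means Pig could not resolve the short name CassandraStorage against its import list, which usually comes down to the Cassandra jars not being on Pig's classpath (pig_cassandra is supposed to export PIG_CLASSPATH for you). As a hedged check, not a confirmed fix, try referencing the UDF by its fully qualified name and registering the jar explicitly at the top of the script; the install path below is taken from the session above, and the keyspace/column family names are placeholders:

        -- paths and names here are illustrative; match them to your install
        register /opt/cassandra/apache-cassandra-1.1.6/lib/apache-cassandra-1.1.6.jar;
        rows = LOAD 'cassandra://MyKeyspace/MyColumnFamily'
               USING org.apache.cassandra.hadoop.pig.CassandraStorage();

    If the fully qualified name resolves where the short name did not, the example script's DEFINE/classpath setup is the place to look next.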

  • ESXi guests not using available CPU resources

    - by Alan M
    I have a VMware ESXi 5.0.0 host (it's a bit old, I know) with three guest VMs on it. For reasons unknown, the guests will not use much of the available CPU resources. I have all three guests in a single pool, each configured to use the same number of resource shares, so they're basically 33% each. The three guests are configured basically identically as far as their VM resources go.

    So the problem is, even when the guests are performing what should be very busy activities, such as at boot, the actual host CPU consumed is something tiny, like 33 MHz, as seen via the vSphere console's "Virtual Machines" tab when viewing properties for the pool. And of course, the performance of the guest VMs is terrible. The host has plenty of CPU to spare. I've tried tinkering with individual guest VM resource settings, cranking up the reservation, etc. No matter: the guests just refuse to make use of the abundant CPU available to them and insist on using a sliver of the available resources. Any suggestions?

    Update after reading various comments below: Per the suggestions below, I did remove the guests from the resource pool; this didn't make any difference. I do understand that the guests are not going to consume resources they do not need. I have tried to run a remote perfmon against the guest which is experiencing long boot times, but I cannot connect to the guest remotely with perfmon (the guest is a Windows Server 2008 R2 server). Host graphs for CPU, memory, and disk are basically flatlining; very little demand. The same is true for the guest stats: while the guest itself seems to be crawling, the guest resource graphs show very little activity across CPU, memory, and disk.

    The host is a Dell PowerEdge 2900 with 2 physical CPUs and 20 GB RAM (it's a test/dev environment using surplus gear).

        Guest1: VM version 7, 2 vCPUs, 4 GB RAM, 140 GB storage on a RAID-5 array on the host.
        Guest2: VM version 7, 2 vCPUs, 4 GB RAM, 140 GB storage on a RAID-5 array on the host.
        Guest3: VM version 7, 1 vCPU, 2 GB RAM, 2 TB storage on a RAID-5 iSCSI NAS box.

    Perhaps I am making a false assumption that if a guest has a demand for CPU (e.g. Windows Task Manager shows 100% CPU), the host would supply the guest with more CPU (and memory, disk) on demand.

    Another update: After checking the stats, it would appear that the host is indeed not busy at all, and neither is the guest. I believe I have a good idea of the issue, though: a messed-up VMware Tools install. The guest has VMware Tools on it, but the host says it does not. VMware Tools refuses to uninstall, refuses to be upgraded, refuses to be recognized. While I cannot say with authority, this would appear to be something worth investigating. I do not know the origin of the guest itself, nor the specifics of the original VMware Tools install. Following various bits of googling, I did come up with a few suggestions that went nowhere. To that end, I was going to delete this question, but was prompted not to do so since so many folks answered. My suspicion right now is that the problem truly is the guest; the guest is not making a demand on the host, and as a natural result, the host is treating the guest accordingly.

    My final update: I am 99% certain the guest VM had something fundamentally wrong with it regarding VMware Tools. I created a clone of a different VM with a near-identical OS config but a properly working install of VMware Tools. The guest runs just great, and takes up its allotment of resources when it needs to; e.g. it eats up about 850 MHz of CPU during startup, then ticks down to idle once the guest OS is stable.


  • Looking for advice on Hyper-V storage replication

    - by Notre1
    I am designing a 2-host Hyper-V R2 cluster with 6-10 guests stored on an SMB iSCSI SAN device (probably Promise VessRAID). I will be getting at least two of the SAN devices and need to eliminate the storage as a single point of failure. Ideally, that would involve real-time failover for the storage, like Windows failover clustering does for the hosts. This design will be used at around six of our sites, and I would like to allow for us to eventually set up a cluster at a colocation site and replicate each site's VMs there for DR. (Ideally a live multi-site cluster, but a manual import of the VMs would be fine for this sort of DR.)

    The tools that come with enterprise SANs, like EMC and NetApp, seem to be the most commonly used items for a Hyper-V cluster, but I can't afford their prices with my budget. Outside of them, the two tools that seem to be most common for Hyper-V storage replication are SteelEye (now SIOS) DataKeeper Cluster Edition and Double-Take Availability. Originally, I was planning on using Cluster Shared Volumes (CSV), but it seems like replication support for these is either not available or brand new in both of these products. It looks like CSVs are supported in Double-Take 5.22 (see this discussion), but I don't think I want to run something that new in production. Right now, it seems like the best option for me is not to implement CSVs, implement some sort of storage replication, and upgrade to CSVs at a later date once replicating them is more mature. I would love to have live migration, and CSVs are not required for live migration if you are using one LUN per VM, so I guess this is what I'll do.

    I would prefer to stick to using the Microsoft Windows Server and Hyper-V tools and features as much as possible. From that standpoint, SteelEye looks more appealing than Double-Take because it makes the DataKeeper volume(s) available to the Failover Cluster Manager, and then failover clustering is all configured and managed through the native Microsoft tools. Double-Take says that "clustered Hyper-V hosts are not supported," and Double-Take Availability itself seems to be what is used for the actual clustering and failover.

    Does anyone know if any of these replication tools work with more than two hosts in the cluster? All the information I can find on the web only uses two hosts in its examples. Are there any better tools than SteelEye and Double-Take for doing what I am trying to do, which is eliminate the storage as a single point of failure? Neverfail, AppAssure, and DataCore all seem to offer similar functionality, but they don't seem to be as popular as SteelEye and Double-Take.

    I have seen a number of people suggest using StarWind iSCSI SAN software for the shared storage, which includes replication (and CSV replication at that). There are a couple of reasons I have not seriously considered this route: 1) the company I work for is exclusively a Dell shop, and Dell does not have any servers that I can pack with more than six 3.5" SATA drives; 2) in the future, it could be advantageous for us not to be locked into a particular brand or type of storage, and third-party replication software all allows replication to heterogeneous storage devices.

    I am pretty new to iSCSI and clustering, so please let me know if it looks like I am planning something that goes against best practices or am overlooking/missing something.


  • Force SSRS 2008 to use SSRS 2005 CSV rendering

    - by Kash
    We are upgrading our report server from SSRS 2005 to SSRS 2008 R2. I have an issue with CSV export rendering in SSRS 2008: the SUMs of columns appear to the right of the detail values in 2008 instead of to the left as in 2005, as shown in the blocks below. 117 and 131 are the sums of Column2 and Column3 respectively.

    SSRS 2005 CSV output:

        Column2_1,Column3_1,Column2,Column3
        117,131,1,2
        117,131,1,2
        117,131,60,23
        117,131,30,15
        117,131,25,89

    SSRS 2008 CSV output:

        Column2,Column3,Column2_1,Column3_1
        1,2,117,131
        1,2,117,131
        60,23,117,131
        30,15,117,131
        25,89,117,131

    I understand that the CSV renderer went through major changes in SSRS 2008 R2, with support for charts and gauges, and, more importantly, that it provides two modes: the default Excel mode and a Compliant mode. But neither mode helps fix this issue. The Compliant mode was supposed to be closest to 2005, but apparently it is not close enough for my case.

    My question: Is there a way to force SSRS 2008 to fall back to a backward-compatibility mode for a report, so that it exports in the 2005 CSV format?

    Solutions tried:

    a) Using 2005-based CRIs. Based on this article on ExecutionLog2, if SSRS 2008 R2 encounters a report that cannot be auto-upgraded (e.g. reports built with 2005-based CustomReportItem controls), those particular reports will be processed with the old Yukon engine in a "transparent backwards-compatibility mode". It seems like it falls back to its previous version mode (2005) and attempts to render it. So I tried using a 2005-based barcode CustomReportItem and deployed it to a SSRS 2008 R2 report server, but it shows the same result as before, though it suppressed the barcode. This would be because SSRS 2008 R2 finds a way to suppress part of the report output and displays the rest. It would be great to find a 2005-based CRI that makes SSRS 2008 R2 process the report with its old Yukon engine. Please note that quite possibly, even if it uses the old Yukon processing engine, it might still use the new CSV renderer and hence show the same output. If that is true, then this option is moot.

    b) Using the XML renderer. We could use a custom XML renderer and then use XSLT to convert the XML to the appropriate CSV, but this would mean converting all 200 of our reports, so it is not feasible.

    Please note that we do not have the option of running SSRS 2005 and SSRS 2008 R2 deployed side by side.

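    If no rendering switch turns up, one pragmatic fallback (not mentioned in the question, and offered here only as a hedged sketch) is to post-process the 2008 CSV and move the aggregate columns back in front of the detail columns. The column names below are taken from the sample output above; the sketch assumes a simple CSV with no embedded commas or line breaks inside values.

        // Sketch: rewrite a 2008-style CSV into the 2005 column order.
        using System;
        using System.IO;
        using System.Linq;

        static class CsvReorder
        {
            static void Main()
            {
                // Desired (2005) order, derived from the sample above.
                string[] desired = { "Column2_1", "Column3_1", "Column2", "Column3" };

                string[] lines = File.ReadAllLines("report2008.csv");
                string[] header = lines[0].Split(',').Select(h => h.Trim('"')).ToArray();

                // For each desired column, find its index in the 2008 header.
                int[] map = desired.Select(name => Array.IndexOf(header, name)).ToArray();

                using (var writer = new StreamWriter("report2005style.csv"))
                {
                    writer.WriteLine(string.Join(",", desired));
                    foreach (string line in lines.Skip(1))
                    {
                        string[] cells = line.Split(',');
                        writer.WriteLine(string.Join(",", map.Select(i => cells[i])));
                    }
                }
            }
        }

    Whether a post-processing step is acceptable obviously depends on how the 200 reports are delivered, but it at least avoids touching the report definitions themselves.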

  • .NET RegEx "Memory Leak" investigation

    - by Kevin Pullin
    I recently looked into some .NET "memory leaks" (i.e. unexpected, lingering GC-rooted objects) in a WinForms app. After loading and then closing a huge report, the memory usage did not drop as expected, even after a couple of gen2 collections. Assuming that the reporting control was being kept alive by a stray event handler, I cracked open WinDbg to see what was happening...

    Using WinDbg, the !dumpheap -stat command reported that a large amount of memory was consumed by string instances. Refining this further with the !dumpheap -type System.String command, I found the culprit, a 90 MB string used for the report, at address 03be7930. The last step was to invoke !gcroot 03be7930 to see which object(s) were keeping it alive.

    My expectations were incorrect: it was not an unhooked event handler hanging onto the reporting control (and report string). Instead it was held by a System.Text.RegularExpressions.RegexInterpreter instance, which is itself a descendant of a System.Text.RegularExpressions.CachedCodeEntry.

    Now, the caching of Regexes is (somewhat) common knowledge, as it helps reduce the overhead of recompiling the Regex each time it is used. But what does this have to do with keeping my string alive? Based on analysis using Reflector, it turns out that the input string is stored in the RegexInterpreter whenever a Regex method is called. The RegexInterpreter holds onto this string reference until a new string is fed into it by a subsequent Regex method invocation. I'd expect similar behaviour from hanging onto Regex.Match instances and perhaps others. The call chain is something like this:

        Regex.Split / Regex.Match / Regex.Replace / etc.
          -> Regex.Run
            -> RegexScanner.Scan (RegexScanner is the base class; RegexInterpreter is the subclass described above)

    The offending Regex is only used for reporting, is rarely used, and is therefore unlikely to be used again soon to clear out the existing report string. And even if the Regex were used at a later point, it would probably be processing another large report. This is a relatively significant problem and just plain feels dirty. All that said, I found a few options on how to resolve, or at least work around, this scenario. I'll let the community respond first, and if no takers come forward I will fill in any gaps in a day or two.

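    Building on the analysis above (the interpreter keeps only its most recent input), one low-risk workaround sketch: immediately after processing the large input, run the same pattern once over a tiny input so the cached interpreter's stored reference is replaced. This is an assumption-laden sketch, not a verified fix for every framework version.

        using System.Text.RegularExpressions;

        static class RegexCacheFlush
        {
            // After 'pattern' has been run over a huge string, feed the static
            // Regex cache a tiny input for the same pattern. Per the analysis
            // above, this replaces the RegexInterpreter's stored input reference,
            // letting the large report string become eligible for collection.
            public static void ReleaseCachedInput(string pattern)
            {
                Regex.IsMatch(string.Empty, pattern);
            }
        }

        // Usage (hypothetical):
        //   var parts = Regex.Split(hugeReport, pattern);
        //   RegexCacheFlush.ReleaseCachedInput(pattern);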

  • What is the best way to archive data in a relational database?

    - by GenericTypeTea
    I have a bit of an issue with a particular aspect of a program I'm working on. I need the ability to archive (fix) a table so that a change anywhere in the system will not affect the results it returns. This is the basic structure of what I need to fix:

        Recipe --> Recipe (as sub-recipe)
        Recipe --> Ingredients

    So, if I fix a Recipe, I need to ensure all the sub-recipes (including all the sub-recipes' sub-recipes and so forth) are fixed, and all its ingredients are fixed. The problem is that the sub-recipes and ingredients still need to be modifiable, as they are used by other recipes that are not fixed. I came up with a solution whereby I serialize (with protobuf-net) a master object that holds the recipe and all the sub-recipes and ingredients, and save the archive data to a table like the following:

        Archive {
            ReferenceId,               -- i.e. RecipeId
            ReferenceTypeId,           -- i.e. Recipe
            ArchiveData varbinary(max)
        }

    Now, this works great and is almost perfect... however, I totally forgot (I'd love to blame the agile development mentality, but this was just short-sighted) that this information needs to be reported on. As far as I'm aware, I can't think how I could inflate the serialized data back into my Recipe object and use it in a report. I'm using the standard SQL 2005 Reporting Services at the moment. Alternatively, I guess I could do the following:

    1) Duplicate every table and tag the word "Archive" onto the end of the table name. This would give me an area of specific archive data... but ignoring my simplified example, there'd actually be about 15 tables duplicated.

    2) Add a nullable, non-foreign-key column called "CopiedFromId" to every table that contains fixed data, and duplicate every record that the recipe (and all its sub-recipes, and all their sub-recipes) touches.

    3) Create some sort of denormalised structure that could be restored from at a later date to the original, unfixed recipe. Although I think this would be like option 1 and involve a lot of extra tables.

    Anyway, I'm at a total loss and do not particularly like any of these ideas. Can anyone please advise the best course of action?

    EDIT: Or 4) Create tables specific to what the report requires and populate them with the data when the user clicks the report button? This would add about 4 extra tables for the report in question.

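    On the narrow point of inflating the blob: protobuf-net can deserialize straight from the varbinary(max) bytes, after which the object graph can be flattened into whatever rows the report needs, which pairs naturally with option 4. A minimal sketch, assuming a Recipe type already decorated for protobuf-net; all names are either from the question or hypothetical.

        using System.IO;
        using ProtoBuf; // protobuf-net, already used for archiving above

        static class ArchiveReader
        {
            // 'blob' is the ArchiveData column, read back via ADO.NET.
            // Recipe is the question's own [ProtoContract] master type.
            public static Recipe Inflate(byte[] blob)
            {
                using (var stream = new MemoryStream(blob))
                {
                    return Serializer.Deserialize<Recipe>(stream);
                }
            }
        }

    The inflated Recipe (with its sub-recipes and ingredients) could then be walked once to fill the report-specific tables from option 4, keeping the archive blob itself as the single source of truth.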

  • Cocoa XML reader app

    - by Miskia
    Hello, I'm a newbie to Cocoa; I've only developed some little apps in C/C++ on Windows. I want to make a "simple" app in Cocoa: when the user opens a specific XML file, the file's nodes are presented in an end-user-viewable form. I made an interface with some NSTextFields, and a subclass of NSDocument called "XMLFile", so I have "XMLFile.h" and "XMLFile.m" in my Xcode project. In my app's plist I set up a new "Document Types" entry: XML File - extensions: xml - role: view - class: XMLFile - store type: XML.

    Here is my "XMLFile.h":

        #import <Cocoa/Cocoa.h>

        @interface XMLFile : NSDocument {
            IBOutlet NSTextField *dateField;
            IBOutlet NSTextField *titleField;
            IBOutlet NSTextField *descField;
            IBOutlet NSTextField *vidfileField;
            IBOutlet NSTextField *imgfileField;
            IBOutlet NSObjectController *object;
            NSUInteger mask;
        }
        @end

    And here is my "XMLFile.m":

        #import "XMLFile.h"

        @implementation XMLFile

        - (BOOL)readFromData:(NSData *)datafile ofType:(NSString *)typeName error:(NSError **)outError {
            NSXMLDocument *doc = [[NSXMLDocument alloc] initWithData:datafile options:mask error:outError];
            NSXMLElement *root = [doc rootElement];

            NSArray *dateElement = [root nodesForXPath:@"//Report/ReportCreationDate" error:nil];
            for (NSXMLElement *xmlElement in dateElement)
                [dateField setStringValue:[xmlElement stringValue]];

            NSArray *titleElement = [root nodesForXPath:@"//Report/ReportTitle" error:nil];
            for (NSXMLElement *xmlElement in titleElement)
                [titleField setStringValue:[xmlElement stringValue]];

            NSArray *descElement = [root nodesForXPath:@"//Report/ReportDescription" error:nil];
            for (NSXMLElement *xmlElement in descElement)
                [descField setStringValue:[xmlElement stringValue]];

            NSArray *vidfileElement = [root nodesForXPath:@"//Report/Videos/Video/VideoPath" error:nil];
            for (NSXMLElement *xmlElement in vidfileElement)
                [vidfileField setStringValue:[xmlElement stringValue]];

            NSArray *imgfileElement = [root nodesForXPath:@"//Report/Videos/Video/VideoThumbnailImageName" error:nil];
            for (NSXMLElement *xmlElement in imgfileElement)
                [imgfileField setStringValue:[xmlElement stringValue]];

            [doc release];
            return YES;
        }

        @end

    So: the user opens the XML file, and NSXMLDocument analyses the file to extract the nodes' data and send it to the different NSTextFields... but it doesn't work :( If someone can help me... I'm a newbie, so don't be too rude if I made big mistakes :) Miskia.


  • How to create filtered reports in WPF?

    - by Michael Goyote
    Creating reports in WPF. I have two related tables:

        Table A - Customer: CustomerID (PK), Name, PhoneNumber, CustomerNum
        Table B - Items: Product, Price, CustomerID

    I want to be able to generate a report like this:

        CustomerA
        Items     Price
        Item A    10
        Item B    10
        Item C    10
        ---------------
        Total     30

    So this is what I have done:

        <Window x:Class="ReportViewerWPF.MainWindow"
                xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
                xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
                xmlns:rv="clr-namespace:Microsoft.Reporting.WinForms;assembly=Microsoft.ReportViewer.WinForms"
                Title="Customer Report" Height="300" Width="400">
            <Grid>
                <WindowsFormsHost Name="windowsFormsHost1">
                    <rv:ReportViewer x:Name="reportViewer1"/>
                </WindowsFormsHost>
            </Grid>
        </Window>

    Then I created a dataset and loaded the two tables, followed by the report wizard (I dragged all the available fields and dropped them on the Values pane). The code behind the WPF window is this:

        public partial class CustomerReport : Window
        {
            public CustomerReport()
            {
                InitializeComponent();
                reportViewer1.Load += ReportViewer_Load;
            }

            private bool _isReportViewerLoaded;

            private void ReportViewer_Load(object sender, EventArgs e)
            {
                if (!_isReportViewerLoaded)
                {
                    Microsoft.Reporting.WinForms.ReportDataSource reportDataSource1 =
                        new Microsoft.Reporting.WinForms.ReportDataSource();

                    HM2DataSet dataset = new HM2DataSet();
                    dataset.BeginInit();
                    reportDataSource1.Name = "DataSet"; // this is the dataset name
                    reportDataSource1.Value = dataset.CustomerTable;
                    this.reportViewer1.LocalReport.DataSources.Add(reportDataSource1);
                    this.reportViewer1.LocalReport.ReportPath = "../../Report3.rdlc";
                    dataset.EndInit();

                    HM2DataSetTableAdapters.CustomerTableAdapter funcTableAdapter =
                        new HM2DataSetTableAdapters.CustomerTableAdapter();
                    funcTableAdapter.ClearBeforeFill = true;
                    funcTableAdapter.Fill(dataset.CustomerTable);

                    reportViewer1.RefreshReport();
                    _isReportViewerLoaded = true;
                }
            }
        }

    As you might have guessed, this loads the list of customers with items and prices:

        Customer     Items     Price
        Customer A   Item A    10
        Customer A   Item B    10
        Customer B   Item D    10
        Customer B   Item C    10

    How can I fine-tune this report to look like the one above, where the user can filter the customer he wants displayed on the report? Thanks in advance for the help. I would prefer to use LINQ for filtering data whenever possible.

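    One way to get the filtering, in line with the LINQ preference at the end of the question (a hedged sketch; the customerId source and column name are assumptions):

        // Requires: using System.Data; using System.Linq;
        // plus a reference to System.Data.DataSetExtensions.
        // Run this after funcTableAdapter.Fill(dataset.CustomerTable).
        int customerId = 42; // would come from a ComboBox or a report parameter

        var rows = dataset.CustomerTable
            .AsEnumerable()
            .Where(r => r.Field<int>("CustomerID") == customerId);

        // Bind the filtered rows instead of the whole table.
        reportDataSource1.Value = rows.Any()
            ? rows.CopyToDataTable()
            : dataset.CustomerTable.Clone(); // empty table with the same schema

    The same idea extends to a LINQ join over the Customer and Items tables if the report needs the per-item rows grouped under one customer.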

  • StreamWriter appends random data

    - by void
    Hi, I'm seeing odd behaviour using the StreamWriter class: it writes extra data to a file when using this code:

        public void WriteToCSV(string fileName)
        {
            StreamWriter streamWriter = null;
            try
            {
                streamWriter = new StreamWriter(fileName);
                Log.Info("Writing CSV report header information ... ");
                streamWriter.WriteLine("\"{0}\",\"{1}\",\"{2}\",\"{3}\"",
                    ((int)CSVRecordType.Header).ToString("D2", CultureInfo.CurrentCulture),
                    m_InputFilename, m_LoadStartDate, m_LoadEndDate);

                int recordCount = 0;
                if (SummarySection)
                {
                    Log.Info("Writing CSV report summary section ... ");
                    foreach (KeyValuePair<KeyValuePair<LoadStatus, string>, CategoryResult> categoryResult
                             in m_DataLoadResult.DataLoadResults)
                    {
                        streamWriter.WriteLine("\"{0}\",\"{1}\",\"{2}\",\"{3}\"",
                            ((int)CSVRecordType.Summary).ToString("D2", CultureInfo.CurrentCulture),
                            categoryResult.Value.StatusString,
                            categoryResult.Value.Count.ToString(CultureInfo.CurrentCulture),
                            categoryResult.Value.Category);
                        recordCount++;
                    }
                }

                Log.Info("Writing CSV report cases section ... ");
                foreach (KeyValuePair<KeyValuePair<LoadStatus, string>, CategoryResult> categoryResult
                         in m_DataLoadResult.DataLoadResults)
                {
                    foreach (CaseLoadResult result in categoryResult.Value.CaseLoadResults)
                    {
                        if ((LoadStatus.Success == result.Status && SuccessCases) ||
                            (LoadStatus.Warnings == result.Status && WarningCases) ||
                            (LoadStatus.Failure == result.Status && FailureCases) ||
                            (LoadStatus.NotProcessed == result.Status && NotProcessedCases))
                        {
                            streamWriter.Write("\"{0}\",\"{1}\",\"{2}\",\"{3}\",\"{4}\"",
                                ((int)CSVRecordType.Result).ToString("D2", CultureInfo.CurrentCulture),
                                result.Status, result.UniqueId, result.Category, result.ClassicReference);
                            if (RawResponse)
                            {
                                streamWriter.Write(",\"{0}\"", result.ResponseXml);
                            }
                            streamWriter.WriteLine();
                            recordCount++;
                        }
                    }
                }

                streamWriter.WriteLine("\"{0}\",\"{1}\"",
                    ((int)CSVRecordType.Count).ToString("D2", CultureInfo.CurrentCulture),
                    recordCount);
                Log.Info("CSV report written to '{0}'", fileName);
            }
            catch (IOException exception)
            {
                string errorMessage = string.Format(CultureInfo.CurrentCulture,
                    "Unable to write CSV report to '{0}'", fileName);
                Log.Error(errorMessage);
                Log.Error(exception.Message);
                throw new MyException(errorMessage, exception);
            }
            finally
            {
                if (null != streamWriter)
                {
                    streamWriter.Close();
                }
            }
        }

    The file produced should contain a set of records, one per line, 0 to N, for example:

        [Record Zero]
        [Record One]
        ...
        [Record N]

    However, the file produced contains either nulls or incomplete records from further up the file appended to the end. For example:

        [Record Zero]
        [Record One]
        ...
        [Record N]
        [Lots of nulls]

    or:

        [Record Zero]
        [Record One]
        ...
        [Record N]
        [Half-complete records]

    This also happens in separate pieces of code that use the StreamWriter class. Furthermore, the files produced all have sizes that are multiples of 1024. I've been unable to reproduce this behaviour on any other machine and have tried recreating the environment. Previous versions of the application didn't exhibit this behaviour despite having the same code for the methods in question.

    EDIT: Added extra code.

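    For what it's worth, the symptoms (trailing nulls, file sizes that are multiples of 1024, suggestively close to StreamWriter's default buffer size) are consistent with a final buffer flush that never completes, for example when an exception other than IOException escapes before Close() finishes. A hedged hardening sketch; it narrows that window rather than proving the cause:

        using System.IO;

        public void WriteToCSV(string fileName)
        {
            // 'using' guarantees Flush + Close on every exit path,
            // including exceptions the IOException catch never sees.
            using (var streamWriter = new StreamWriter(fileName))
            {
                // ... write the header, summary and result records as above ...
                streamWriter.Flush(); // make the final flush explicit
            }
        }

    If the corruption persists even with this shape, that points away from the code and toward the environment (disk, filesystem, antivirus), which would square with it not reproducing on any other machine.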

  • Linux Mint 10 LXDE computer won't act as an LTSP server

    - by Rautamiekka
    So I've tried to make our screen-broken HP laptop also serve as an LTSP server, in addition to various other tasks, without luck, which may be because I'm running Linux Mint 10 LXDE while the instructions are for Ubuntu. Excuse my ignorance. The entire output from the Terminal after installing the LTSP packages along with a server kernel and a load of other packages:

        administrator@rauta-mint-turion ~ $ sudo lt
        ltrace               ltsp-build-client    ltsp-chroot          ltspfs               ltspfsmounter
        ltsp-info            ltsp-localapps       ltsp-update-image    ltsp-update-kernels  ltsp-update-sshkeys
        administrator@rauta-mint-turion ~ $ sudo ltsp-update-sshkeys
        administrator@rauta-mint-turion ~ $ sudo ltsp-update-kernels
        find: `/opt/ltsp/': No such file or directory
        administrator@rauta-mint-turion ~ $ sudo ltsp-build-client
        /usr/share/ltsp/plugins/ltsp-build-client/common/010-chroot-tagging: line 3: /opt/ltsp/i386/etc/ltsp_chroot: No such file or directory
        error: LTSP client installation ended abnormally
        administrator@rauta-mint-turion ~ $ sudo ltsp-update-image
        Cannot determine assigned port. Assigning to port 2000.
        mkdir: cannot create directory `/opt/ltsp/i386/etc/ltsp': No such file or directory
        /usr/sbin/ltsp-update-image: 274: cannot create /opt/ltsp/i386/etc/ltsp/update-kernels.conf: Directory nonexistent
        /usr/sbin/ltsp-update-image: 274: cannot create /opt/ltsp/i386/etc/ltsp/update-kernels.conf: Directory nonexistent
        Regenerating kernel...
        chroot: failed to run command `/usr/share/ltsp/update-kernels': No such file or directory
        Done.
        Configuring inetd...
        Done.
        Updating pxelinux default configuration...Done.
        Skipping invalid chroot: /opt/ltsp/i386
        chroot: failed to run command `test': No such file or directory
        administrator@rauta-mint-turion ~ $ sudo ltsp-chroot
        chroot: failed to run command `/bin/bash': No such file or directory
        administrator@rauta-mint-turion ~ $ bash
        administrator@rauta-mint-turion ~ $ exit
        exit
        administrator@rauta-mint-turion ~ $ sudo ls /opt/ltsp
        i386
        administrator@rauta-mint-turion ~ $ sudo ls /opt/ltsp/i386/
        administrator@rauta-mint-turion ~ $ sudo ltsp-build-client
        NOTE: Root directory /opt/ltsp/i386 already exists, this will lead to problems, please remove it before trying again. Exiting.
error: LTSP client installation ended abnormally administrator@rauta-mint-turion ~ $ sudo rm -rv /opt/ltsp/i386 removed directory: `/opt/ltsp/i386' administrator@rauta-mint-turion ~ $ sudo ltsp-build-client /usr/share/ltsp/plugins/ltsp-build-client/common/010-chroot-tagging: line 3: /opt/ltsp/i386/etc/ltsp_chroot: No such file or directory error: LTSP client installation ended abnormally administrator@rauta-mint-turion ~ $ aptitude search ltsp p fts-ltsp-ldap - LDAP LTSP module for the TFTP/Fuse supplicant p ltsp-client - LTSP client environment p ltsp-client-core - LTSP client environment (core) p ltsp-cluster-accountmanager - Account creation and management daemon for LTSP p ltsp-cluster-control - Web based thin-client configuration management p ltsp-cluster-lbagent - LTSP loadbalancer agent offers variables about the state of the ltsp server p ltsp-cluster-lbserver - LTSP loadbalancer server returns the optimal ltsp server to terminal p ltsp-cluster-nxloadbalancer - Minimal NX loadbalancer for ltsp-cluster p ltsp-cluster-pxeconfig - LTSP-Cluster symlink generator p ltsp-controlaula - Classroom management tool with ltsp clients p ltsp-docs - LTSP Documentation p ltsp-livecd - starts an LTSP live server on an Ubuntu livecd session p ltsp-manager - Ubuntu LTSP server management GUI i A ltsp-server - Basic LTSP server environment i ltsp-server-standalone - Complete LTSP server environment i A ltspfs - Fuse based remote filesystem for LTSP thin clients p ltspfsd - Fuse based remote filesystem hooks for LTSP thin clients p ltspfsd-core - Fuse based remote filesystem daemon for LTSP thin clients p python-ltsp - provides ltsp related functions administrator@rauta-mint-turion ~ $ sudo aptitude purge ltsp-server ltsp-server-standalone ltspfs The following packages will be REMOVED: debconf-utils{u} debootstrap{u} dhcp3-server{u} gstreamer0.10-pulseaudio{u} ldm-server{u} libpulse-browse0{u} ltsp-server{p} ltsp-server-standalone{p} ltspfs{p} nbd-server{u} openbsd-inetd{u} pulseaudio{u} pulseaudio-esound-compat{u} pulseaudio-module-x11{u} pulseaudio-utils{u} squashfs-tools{u} tftpd-hpa{u} 0 packages upgraded, 0 newly installed, 17 to remove and 0 not upgraded. Need to get 0B of archives. After unpacking 6,996kB will be freed. Do you want to continue? [Y/n/?] (Reading database ... 158454 files and directories currently installed.) Removing ltsp-server-standalone ... Purging configuration files for ltsp-server-standalone ... Removing ltsp-server ... Purging configuration files for ltsp-server ... dpkg: warning: while removing ltsp-server, directory '/var/lib/tftpboot/ltsp' not empty so not removed. dpkg: warning: while removing ltsp-server, directory '/var/lib/tftpboot' not empty so not removed. Processing triggers for man-db ... (Reading database ... 158195 files and directories currently installed.) Removing debconf-utils ... Removing debootstrap ... Removing dhcp3-server ... * Stopping DHCP server dhcpd3 [ OK ] Removing gstreamer0.10-pulseaudio ... Removing ldm-server ... Removing pulseaudio-module-x11 ... Removing pulseaudio-esound-compat ... Removing pulseaudio ... * PulseAudio configured for per-user sessions Removing pulseaudio-utils ... Removing libpulse-browse0 ... Processing triggers for man-db ... Processing triggers for ureadahead ... ureadahead will be reprofiled on next reboot Processing triggers for libc-bin ... ldconfig deferred processing now taking place (Reading database ... 157944 files and directories currently installed.) Removing ltspfs ... Processing triggers for man-db ... 
(Reading database ... 157932 files and directories currently installed.) Removing nbd-server ... Stopping Network Block Device server: nbd-server. Removing openbsd-inetd ... * Stopping internet superserver inetd [ OK ] Removing squashfs-tools ... Removing tftpd-hpa ... tftpd-hpa stop/waiting Processing triggers for ureadahead ... Processing triggers for man-db ... administrator@rauta-mint-turion ~ $ sudo aptitude purge ~c The following packages will be REMOVED: dhcp3-server{p} libpulse-browse0{p} nbd-server{p} openbsd-inetd{p} pulseaudio{p} tftpd-hpa{p} 0 packages upgraded, 0 newly installed, 6 to remove and 0 not upgraded. Need to get 0B of archives. After unpacking 0B will be used. Do you want to continue? [Y/n/?] (Reading database ... 157881 files and directories currently installed.) Removing dhcp3-server ... Purging configuration files for dhcp3-server ... Removing libpulse-browse0 ... Purging configuration files for libpulse-browse0 ... Removing nbd-server ... Purging configuration files for nbd-server ... Removing openbsd-inetd ... Purging configuration files for openbsd-inetd ... Removing pulseaudio ... Purging configuration files for pulseaudio ... Removing tftpd-hpa ... Purging configuration files for tftpd-hpa ... Processing triggers for ureadahead ... administrator@rauta-mint-turion ~ $ sudo aptitude install ltsp-server-standalone The following NEW packages will be installed: debconf-utils{a} debootstrap{a} ldm-server{a} ltsp-server{a} ltsp-server-standalone ltspfs{a} nbd-server{a} openbsd-inetd{a} squashfs-tools{a} The following packages are RECOMMENDED but will NOT be installed: dhcp3-server pulseaudio-esound-compat tftpd-hpa 0 packages upgraded, 9 newly installed, 0 to remove and 0 not upgraded. Need to get 0B/498kB of archives. After unpacking 2,437kB will be used. Do you want to continue? [Y/n/?] Preconfiguring packages ... Selecting previously deselected package openbsd-inetd. (Reading database ... 157868 files and directories currently installed.) Unpacking openbsd-inetd (from .../openbsd-inetd_0.20080125-4ubuntu2_i386.deb) ... Processing triggers for man-db ... Processing triggers for ureadahead ... Setting up openbsd-inetd (0.20080125-4ubuntu2) ... * Stopping internet superserver inetd [ OK ] * Starting internet superserver inetd [ OK ] Selecting previously deselected package ldm-server. (Reading database ... 157877 files and directories currently installed.) Unpacking ldm-server (from .../ldm-server_2%3a2.1.3-0ubuntu1_all.deb) ... Selecting previously deselected package debconf-utils. Unpacking debconf-utils (from .../debconf-utils_1.5.32ubuntu3_all.deb) ... Selecting previously deselected package debootstrap. Unpacking debootstrap (from .../debootstrap_1.0.23ubuntu1_all.deb) ... Selecting previously deselected package nbd-server. Unpacking nbd-server (from .../nbd-server_1%3a2.9.14-2ubuntu1_i386.deb) ... Selecting previously deselected package squashfs-tools. Unpacking squashfs-tools (from .../squashfs-tools_1%3a4.0-8_i386.deb) ... Selecting previously deselected package ltsp-server. 
GNU nano 2.2.4 File: /etc/ltsp/ltsp-update-image.conf # Configuration file for ltsp-update-image # By default, do not compress the image # as it's reported to make it unstable NO_COMP="-noF -noD -noI -no-exports" [ Switched to /etc/ltsp/ltsp-update-image.conf ] administrator@rauta-mint-turion ~ $ ls /opt/ firefox/ ltsp/ mint-flashplugin/ administrator@rauta-mint-turion ~ $ ls /opt/ltsp/i386/ administrator@rauta-mint-turion ~ $ ls /opt/ltsp/ i386 administrator@rauta-mint-turion ~ $ sudo ltsp ltsp-build-client ltsp-chroot ltspfs ltspfsmounter ltsp-info ltsp-localapps ltsp-update-image ltsp-update-kernels ltsp-update-sshkeys administrator@rauta-mint-turion ~ $ sudo ltsp-build-client NOTE: Root directory /opt/ltsp/i386 already exists, this will lead to problems, please remove it before trying again. Exiting. error: LTSP client installation ended abnormally administrator@rauta-mint-turion ~ $ ^C administrator@rauta-mint-turion ~ $ sudo aptitude purge ltsp ltspfs ltsp-server ltsp-server-standalone administrator@rauta-mint-turion ~ $ sudo aptitude purge ltsp-server The following packages will be REMOVED: ltsp-server{p} 0 packages upgraded, 0 newly installed, 1 to remove and 0 not upgraded. Need to get 0B of archives. After unpacking 1,073kB will be freed. The following packages have unmet dependencies: ltsp-server-standalone: Depends: ltsp-server but it is not going to be installed. The following actions will resolve these dependencies: Remove the following packages: 1) ltsp-server-standalone Accept this solution? [Y/n/q/?] The following packages will be REMOVED: debconf-utils{u} debootstrap{u} ldm-server{u} ltsp-server{p} ltsp-server-standalone{a} ltspfs{u} nbd-server{u} openbsd-inetd{u} squashfs-tools{u} 0 packages upgraded, 0 newly installed, 9 to remove and 0 not upgraded. Need to get 0B of archives. After unpacking 2,437kB will be freed. Do you want to continue? [Y/n/?] (Reading database ... 158244 files and directories currently installed.) Removing ltsp-server-standalone ... (Reading database ... 158240 files and directories currently installed.) Removing ltsp-server ... Purging configuration files for ltsp-server ... dpkg: warning: while removing ltsp-server, directory '/var/lib/tftpboot/ltsp' not empty so not removed. dpkg: warning: while removing ltsp-server, directory '/var/lib/tftpboot' not empty so not removed. Processing triggers for man-db ... (Reading database ... 157987 files and directories currently installed.) Removing debconf-utils ... Removing debootstrap ... Removing ldm-server ... Removing ltspfs ... Removing nbd-server ... Stopping Network Block Device server: nbd-server. Removing openbsd-inetd ... * Stopping internet superserver inetd [ OK ] Removing squashfs-tools ... Processing triggers for man-db ... Processing triggers for ureadahead ... administrator@rauta-mint-turion ~ $ sudo aptitude purge ~c The following packages will be REMOVED: ltsp-server-standalone{p} nbd-server{p} openbsd-inetd{p} 0 packages upgraded, 0 newly installed, 3 to remove and 0 not upgraded. Need to get 0B of archives. After unpacking 0B will be used. Do you want to continue? [Y/n/?] (Reading database ... 157871 files and directories currently installed.) Removing ltsp-server-standalone ... Purging configuration files for ltsp-server-standalone ... Removing nbd-server ... Purging configuration files for nbd-server ... Removing openbsd-inetd ... Purging configuration files for openbsd-inetd ... Processing triggers for ureadahead ... 
administrator@rauta-mint-turion ~ $ sudo rm -rv /var/lib/t teamspeak-server/ tftpboot/ transmission-daemon/ administrator@rauta-mint-turion ~ $ sudo rm -rv /var/lib/tftpboot removed `/var/lib/tftpboot/ltsp/i386/pxelinux.cfg/default' removed directory: `/var/lib/tftpboot/ltsp/i386/pxelinux.cfg' removed directory: `/var/lib/tftpboot/ltsp/i386' removed directory: `/var/lib/tftpboot/ltsp' removed directory: `/var/lib/tftpboot' administrator@rauta-mint-turion ~ $ sudo find / -name "ltsp" /opt/ltsp administrator@rauta-mint-turion ~ $ sudo rm -rv /opt/ltsp removed directory: `/opt/ltsp/i386' removed directory: `/opt/ltsp' administrator@rauta-mint-turion ~ $ sudo aptitude install ltsp-server-standalone The following NEW packages will be installed: debconf-utils{a} debootstrap{a} ldm-server{a} ltsp-server{a} ltsp-server-standalone ltspfs{a} nbd-server{a} openbsd-inetd{a} squashfs-tools{a} The following packages are RECOMMENDED but will NOT be installed: dhcp3-server pulseaudio-esound-compat tftpd-hpa 0 packages upgraded, 9 newly installed, 0 to remove and 0 not upgraded. Need to get 0B/498kB of archives. After unpacking 2,437kB will be used. Do you want to continue? [Y/n/?] Preconfiguring packages ... Selecting previously deselected package openbsd-inetd. (Reading database ... 157868 files and directories currently installed.) Unpacking openbsd-inetd (from .../openbsd-inetd_0.20080125-4ubuntu2_i386.deb) ... Processing triggers for man-db ... GNU nano 2.2.4 New Buffer GNU nano 2.2.4 File: /etc/ltsp/dhcpd.conf # # Default LTSP dhcpd.conf config file. # authoritative; subnet 192.168.2.0 netmask 255.255.255.0 { range 192.168.2.70 192.168.2.79; option domain-name "jarvinen"; option domain-name-servers 192.168.2.254; option broadcast-address 192.168.2.255; option routers 192.168.2.254; # next-server 192.168.0.1; # get-lease-hostnames true; option subnet-mask 255.255.255.0; option root-path "/opt/ltsp/i386"; if substring( option vendor-class-identifier, 0, 9 ) = "PXEClient" { filename "/ltsp/i386/pxelinux.0"; } else { filename "/ltsp/i386/nbi.img"; } } [ Wrote 22 lines ] administrator@rauta-mint-turion ~ $ sudo service dhcp3-server start * Starting DHCP server dhcpd3 [ OK ] administrator@rauta-mint-turion ~ $ sudo ltsp-build-client /usr/share/ltsp/plugins/ltsp-build-client/common/010-chroot-tagging: line 3: /opt/ltsp/i386/etc/ltsp_chroot: No such file or directory error: LTSP client installation ended abnormally administrator@rauta-mint-turion ~ $ sudo cat /usr/share/ltsp/plugins/ltsp-build-client/common/010-chroot-tagging case "$MODE" in after-install) echo LTSP_CHROOT=$ROOT >> $ROOT/etc/ltsp_chroot ;; esac administrator@rauta-mint-turion ~ $ cd $ROOT/etc/ltsp_chroot bash: cd: /etc/ltsp_chroot: No such file or directory administrator@rauta-mint-turion ~ $ cd $ROOT/etc/ administrator@rauta-mint-turion /etc $ ls acpi chatscripts emacs group insserv.conf.d logrotate.conf mysql php5 rc6.d smbnetfs.conf UPower adduser.conf ConsoleKit environment group- iproute2 logrotate.d nanorc phpmyadmin rc.local snmp usb_modeswitch.conf alternatives console-setup esound grub.d issue lsb-base nbd-server pm rcS.d sound usb_modeswitch.d anacrontab cron.d firefox gshadow issue.net lsb-base-logging.sh ndiswrapper pnm2ppa.conf request-key.conf ssh ushare.conf apache2 cron.daily firefox-3.5 gshadow- java lsb-release netscsid.conf polkit-1 resolvconf ssl vga apm cron.hourly firestarter gtk-2.0 java-6-sun ltrace.conf network popularity-contest.conf resolv.conf sudoers vim apparmor cron.monthly fonts gtkmathview kbd ltsp 
NetworkManager ppp rmt sudoers.d vlc apparmor.d crontab foomatic gufw kernel lxdm networks printcap rpc su-to-rootrc vsftpd.conf apport cron.weekly fstab hal kernel-img.conf magic nsswitch.conf profile rsyslog.conf sweeprc w3m apt crypttab ftpusers hdparm.conf kerneloops.conf magic.mime ntp.conf profile.d rsyslog.d sysctl.conf wgetrc at.deny cups fuse.conf host.conf kompozer mailcap obex-data-server protocols samba sysctl.d wildmidi auto-apt dbconfig-common gai.conf hostname ldap mailcap.order ODBCDataSources psiconv sane.d teamspeak-server wodim.conf avahi dbus-1 gamin hosts ld.so.cache manpath.config odbc.ini pulse screenrc terminfo wpa_supplicant bash.bashrc debconf.conf gconf hosts.allow ld.so.conf menu openal purple securetty thunderbird X11 bash_completion debian_version gdb hosts.deny ld.so.conf.d menu-methods openoffice python security timezone xdg bash_completion.d default gdm hp legal mime.types opt python2.6 sensors3.conf transmission-daemon xml bindresvport.blacklist defoma ghostscript ifplugd lftp.conf mke2fs.conf pam.conf python2.7 sensors.d ts.conf xulrunner-1.9.2 blkid.conf deluser.conf gimp inetd.conf libpaper.d modprobe.d pam.d python3.1 services ucf.conf zsh_command_not_found blkid.tab depmod.d gnome init libreoffice modules pango rc0.d sgml udev bluetooth dhcp gnome-system-tools init.d linuxmint motd papersize rc1.d shadow ufw bonobo-activation dhcp3 gnome-vfs-2.0 initramfs-tools locale.alias mplayer passwd rc2.d shadow- updatedb.conf ca-certificates dictionaries-common gnome-vfs-mime-magic inputrc localtime mtab passwd- rc3.d shells update-manager ca-certificates.conf doc-base gre.d insserv logcheck mtab.fuselock pcmcia rc4.d skel update-motd.d calendar dpkg groff insserv.conf login.defs mtools.conf perl rc5.d smb2www update-notifier administrator@rauta-mint-turion /etc $ cd ltsp/ administrator@rauta-mint-turion /etc/ltsp $ ls dhcpd.conf ltsp-update-image.conf administrator@rauta-mint-turion /etc/ltsp $ cat dhcpd.conf # # Default LTSP dhcpd.conf config file. # authoritative; subnet 192.168.2.0 netmask 255.255.255.0 { range 192.168.2.70 192.168.2.79; option domain-name "jarvinen"; option domain-name-servers 192.168.2.254; option broadcast-address 192.168.2.255; option routers 192.168.2.254; # next-server 192.168.0.1; # get-lease-hostnames true; option subnet-mask 255.255.255.0; option root-path "/opt/ltsp/i386"; if substring( option vendor-class-identifier, 0, 9 ) = "PXEClient" { filename "/ltsp/i386/pxelinux.0"; } else { filename "/ltsp/i386/nbi.img"; } } administrator@rauta-mint-turion /etc/ltsp $


  • Problems with Update Manager

    - by user65965
    Whenever I try to update with Update Manager I get the following errors:

        W:Failed to fetch http://archive.ubuntu.com/ubuntu/dists/precise/Release  Unable to find expected entry 'commercial/source/Sources' in Release file (Wrong sources.list entry or malformed file)
        W:Failed to fetch http://archive.ubuntu.com/ubuntu/dists/precise-updates/Release  Unable to find expected entry 'commercial/source/Sources' in Release file (Wrong sources.list entry or malformed file)
        W:Failed to fetch http://archive.ubuntu.com/ubuntu/dists/precise-backports/Release  Unable to find expected entry 'commercial/source/Sources' in Release file (Wrong sources.list entry or malformed file)
        W:Failed to fetch http://archive.ubuntu.com/ubuntu/dists/precise-security/Release  Unable to find expected entry 'commercial/source/Sources' in Release file (Wrong sources.list entry or malformed file)
        W:Failed to fetch http://ppa.launchpad.net/iefremov/ppa/ubuntu/dists/precise/main/source/Sources  404 Not Found
        W:Failed to fetch http://ppa.launchpad.net/iefremov/ppa/ubuntu/dists/precise/main/binary-amd64/Packages  404 Not Found
        W:Failed to fetch http://ppa.launchpad.net/iefremov/ppa/ubuntu/dists/precise/main/binary-i386/Packages  404 Not Found
        E:Some index files failed to download. They have been ignored, or old ones used instead.

    Any help would be much appreciated, thank you.

    Thank you very much, Eliah. I'm still pretty new to Ubuntu. Here's the output I got from the terminal:

        No LSB modules are available.
        Distributor ID: Ubuntu
        Description:    Ubuntu 12.04 LTS
        Release:        12.04
        Codename:       precise

        # See http://help.ubuntu.com/community/UpgradeNotes for how to upgrade to
        # newer versions of the distribution.
        deb http://archive.ubuntu.com/ubuntu precise main restricted commercial
        deb-src http://archive.ubuntu.com/ubuntu precise restricted main commercial multiverse universe #Added by software-properties

        ## Major bug fix updates produced after the final release of the
        ## distribution.
        deb http://archive.ubuntu.com/ubuntu precise-updates main restricted commercial
        deb-src http://archive.ubuntu.com/ubuntu precise-updates restricted main commercial multiverse universe #Added by software-properties

        ## N.B. software from this repository is ENTIRELY UNSUPPORTED by the Ubuntu
        ## team. Also, please note that software in universe WILL NOT receive any
        ## review or updates from the Ubuntu security team.
        deb http://archive.ubuntu.com/ubuntu precise universe
        deb http://archive.ubuntu.com/ubuntu precise-updates universe

        ## N.B. software from this repository is ENTIRELY UNSUPPORTED by the Ubuntu
        ## team, and may not be under a free licence. Please satisfy yourself as to
        ## your rights to use the software. Also, please note that software in
        ## multiverse WILL NOT receive any review or updates from the Ubuntu
        ## security team.
        deb http://archive.ubuntu.com/ubuntu precise multiverse
        deb http://archive.ubuntu.com/ubuntu precise-updates multiverse

        ## N.B. software from this repository may not have been tested as
        ## extensively as that contained in the main release, although it includes
        ## newer versions of some applications which may provide useful features.
        ## Also, please note that software in backports WILL NOT receive any review
        ## or updates from the Ubuntu security team.
        deb http://archive.ubuntu.com/ubuntu precise-backports main restricted universe multiverse commercial
        deb-src http://archive.ubuntu.com/ubuntu precise-backports main restricted universe multiverse commercial #Added by software-properties

        deb http://archive.ubuntu.com/ubuntu precise-security main restricted commercial
        deb-src http://archive.ubuntu.com/ubuntu precise-security restricted main commercial multiverse universe #Added by software-properties
        deb http://archive.ubuntu.com/ubuntu precise-security universe
        deb http://archive.ubuntu.com/ubuntu precise-security multiverse

        ## Uncomment the following two lines to add software from Canonical's
        ## 'partner' repository.
        ## This software is not part of Ubuntu, but is offered by Canonical and the
        ## respective vendors as a service to Ubuntu users.
        deb http://archive.canonical.com/ubuntu oneiric partner
        deb-src http://archive.canonical.com/ubuntu precise partner

        ## This software is not part of Ubuntu, but is offered by third-party
        ## developers who want to ship their latest software.
        deb http://extras.ubuntu.com/ubuntu precise main
        deb-src http://extras.ubuntu.com/ubuntu precise main

        ## This is a 3rd party script to install and update Oracle Java
        deb http://www.duinsoft.nl/pkg debs all

        ## Sun-Java6-JRE
        deb http://security.ubuntu.com/ubuntu hardy-security main multiverse

        ** /etc/apt/sources.list.d/askubuntu-tools-ppa-precise.list:
        deb http://ppa.launchpad.net/askubuntu-tools/ppa/ubuntu precise main
        deb-src http://ppa.launchpad.net/askubuntu-tools/ppa/ubuntu precise main

        ** /etc/apt/sources.list.d/askubuntu-tools-ppa-precise.list.save:
        deb http://ppa.launchpad.net/askubuntu-tools/ppa/ubuntu precise main
        deb-src http://ppa.launchpad.net/askubuntu-tools/ppa/ubuntu precise main

        ** /etc/apt/sources.list.d/effie-jayx-turpial-oneiric.list:
        deb http://ppa.launchpad.net/effie-jayx/turpial/ubuntu precise main # disabled on upgrade to precise
        deb-src http://ppa.launchpad.net/effie-jayx/turpial/ubuntu precise main # disabled on upgrade to precise

        ** /etc/apt/sources.list.d/effie-jayx-turpial-oneiric.list.distUpgrade:
        deb http://ppa.launchpad.net/effie-jayx/turpial/ubuntu oneiric main
        deb-src http://ppa.launchpad.net/effie-jayx/turpial/ubuntu oneiric main

        ** /etc/apt/sources.list.d/effie-jayx-turpial-oneiric.list.save:
        deb http://ppa.launchpad.net/effie-jayx/turpial/ubuntu precise main # disabled on upgrade to precise
        deb-src http://ppa.launchpad.net/effie-jayx/turpial/ubuntu precise main # disabled on upgrade to precise

        ** /etc/apt/sources.list.d/getdeb.list:
        # deb http://archive.getdeb.net/ubuntu oneiric-getdeb apps # disabled on upgrade to precise

        ** /etc/apt/sources.list.d/getdeb.list.distUpgrade:
        deb http://archive.getdeb.net/ubuntu oneiric-getdeb apps

        ** /etc/apt/sources.list.d/getdeb.list.save:
        # deb http://archive.getdeb.net/ubuntu oneiric-getdeb apps # disabled on upgrade to precise

        ** /etc/apt/sources.list.d/hotot-team-ppa-oneiric.list:
        deb http://ppa.launchpad.net/hotot-team/ppa/ubuntu precise main # disabled on upgrade to precise
        deb-src http://ppa.launchpad.net/hotot-team/ppa/ubuntu precise main # disabled on upgrade to precise

        ** /etc/apt/sources.list.d/hotot-team-ppa-oneiric.list.distUpgrade:
        deb http://ppa.launchpad.net/hotot-team/ppa/ubuntu oneiric main
        deb-src http://ppa.launchpad.net/hotot-team/ppa/ubuntu oneiric main

        ** /etc/apt/sources.list.d/hotot-team-ppa-oneiric.list.save:
        deb http://ppa.launchpad.net/hotot-team/ppa/ubuntu precise main # disabled on upgrade to precise
        deb-src http://ppa.launchpad.net/hotot-team/ppa/ubuntu precise main # disabled on upgrade to precise

        ** /etc/apt/sources.list.d/iefremov-ppa-precise.list:
        deb http://ppa.launchpad.net/iefremov/ppa/ubuntu precise main
        deb-src http://ppa.launchpad.net/iefremov/ppa/ubuntu precise main

        ** /etc/apt/sources.list.d/iefremov-ppa-precise.list.save:
        deb http://ppa.launchpad.net/iefremov/ppa/ubuntu precise main
        deb-src http://ppa.launchpad.net/iefremov/ppa/ubuntu precise main

        ** /etc/apt/sources.list.d/jockey.list:
        deb http://www.openprinting.org/download/printdriver/debian/ lsb3.2 main-nonfree # disabled on upgrade to precise

        ** /etc/apt/sources.list.d/jockey.list.distUpgrade:
        deb http://www.openprinting.org/download/printdriver/debian/ lsb3.2 main-nonfree

        ** /etc/apt/sources.list.d/jockey.list.save:
        deb http://www.openprinting.org/download/printdriver/debian/ lsb3.2 main-nonfree # disabled on upgrade to precise

        ** /etc/apt/sources.list.d/plexydesk-plexydesk-dailybuild-precise.list:
        deb http://ppa.launchpad.net/plexydesk/plexydesk-dailybuild/ubuntu precise main
        deb-src http://ppa.launchpad.net/plexydesk/plexydesk-dailybuild/ubuntu precise main

        ** /etc/apt/sources.list.d/plexydesk-plexydesk-dailybuild-precise.list.save:
        deb http://ppa.launchpad.net/plexydesk/plexydesk-dailybuild/ubuntu precise main
        deb-src http://ppa.launchpad.net/plexydesk/plexydesk-dailybuild/ubuntu precise main

        ** /etc/apt/sources.list.d/precise-partner.list:
        deb http://archive.canonical.com/ubuntu precise partner #Added by software-center

        ** /etc/apt/sources.list.d/precise-partner.list.save:
        deb http://archive.canonical.com/ubuntu precise partner #Added by software-center

        ** /etc/apt/sources.list.d/private-ppa.launchpad.net_commercial-ppa-uploaders_crossover-pro_ubuntu.list:
        # deb https://justin-dormandy:[email protected]/commercial-ppa-uploaders/crossover-pro/ubuntu precise main #Added by software-center disabled on upgrade to precise

        ** /etc/apt/sources.list.d/private-ppa.launchpad.net_commercial-ppa-uploaders_crossover-pro_ubuntu.list.distUpgrade:
        cat: /etc/apt/sources.list.d/private-ppa.launchpad.net_commercial-ppa-uploaders_crossover-pro_ubuntu.list.distUpgrade: Permission denied

        ** /etc/apt/sources.list.d/private-ppa.launchpad.net_commercial-ppa-uploaders_crossover-pro_ubuntu.list.save:
        cat: /etc/apt/sources.list.d/private-ppa.launchpad.net_commercial-ppa-uploaders_crossover-pro_ubuntu.list.save: Permission denied

        ** /etc/apt/sources.list.d/screenlets-ppa-precise.list:
        deb http://ppa.launchpad.net/screenlets/ppa/ubuntu precise main
        deb-src http://ppa.launchpad.net/screenlets/ppa/ubuntu precise main

        ** /etc/apt/sources.list.d/screenlets-ppa-precise.list.save:
        deb http://ppa.launchpad.net/screenlets/ppa/ubuntu precise main
        deb-src http://ppa.launchpad.net/screenlets/ppa/ubuntu precise main

        ** /etc/apt/sources.list.d/webupd8team-java-precise.list:
        deb http://ppa.launchpad.net/webupd8team/java/ubuntu precise main
        deb-src http://ppa.launchpad.net/webupd8team/java/ubuntu precise main

        ** /etc/apt/sources.list.d/webupd8team-java-precise.list.save:
        deb http://ppa.launchpad.net/webupd8team/java/ubuntu precise main
        deb-src http://ppa.launchpad.net/webupd8team/java/ubuntu precise main

    Read the article

  • Are you cashing in on the MVP complimentary subscriptions?

    - by Tarun Arora
    The two most asked questions in the Microsoft technology communities around the Microsoft MVP program are: 1. How do I become a Microsoft MVP? 2. What benefits do I get as an MVP? The answer to the first question has been well answered here. In this blog post, I'll try and answer the second question. Below is a comprehensive list of the Not for Resale personal subscriptions of various products that Microsoft MVPs are eligible for:

    JetBrains: Resharper, dotTrace, dotCover & WebStorm - https://www.jetbrains.com/resharper/buy/mvp.html
    RedGate: SQL Server development, database administration, .NET development, Azure development (merged with Cerebrata), MySQL development, Oracle development - http://www.red-gate.com/community/mvp-program
    Pluralsight: Pluralsight on-demand training - http://blog.pluralsight.com/2011/02/28/pluralsight-for-mvp/
    Cerebrata: Cloud Storage Studio and Azure Diagnostic Manager (part of RedGate now) - https://www.cerebrata.com/Offers/mvp.aspx
    Telerik: Telerik Ultimate collection & Telerik TeamPulse - http://blogs.telerik.com/blogs/posts/11-03-01/telerik-gift-for-microsoft-mvps.aspx
    Developer Express: DevEx controls - http://www.devexpress.com/Home/Community/mvp.xml
    InnerWorkings: 600 hours of .NET training catalogue - http://www.innerworkings.com/mvp
    Typemock: Typemock Isolator, Typemock Isolator for SharePoint developers, Typemock Isolator for web developers, TestDriven.NET - http://www.typemock.com/mvp
    SpeakFlow: A suite of tools for creating, managing, and delivering non-linear presentations - http://www.speakflow.com/
    TechSmith: Camtasia Studio, SnagIt, screencasting - http://www.techsmith.com/camtasia.html
    Altova: Altova XMLSpy - http://www.altova.com/xml-editor/
    VisualSVN: VisualSVN Subversion integration plug-in for Visual Studio - http://www.visualsvn.com/visualsvn/purchase/mvp/
    PreEmptive Solutions: Professional PreEmptive Analytics, Dotfuscator - http://www.preemptive.com/landing/mvp
    Armadillo: Armadillo Adaptive Bug Prevention - http://www.armadilloverdrive.com/
    IS Decisions: NFR licenses for UserLock, RemoteExec, FileAudit & WinReporter - http://www.isdecisions.com/download/mvp-mct-program.htm
    Idera: SQL tools - http://www.idera.com/Content/Home.aspx
    West Wind: Help Builder solution - http://www.west-wind.com/weblog/posts/2005/Mar/09/Are-you-a-Microsoft-MVP-Get-a-FREE-copy-of-West-Wind-Html-Help-Builder
    Bamboo: SharePoint tools - http://community.bamboosolutions.com/blogs/partner-advantage-program/archive/2008/08/01/partner-advantage-program-mvp.aspx
    Nitriq: Nitriq code analysis - http://blog.nitriq.com/FreeLicensesForMicrosoftMVPs.aspx
    ByteScout: Components, libraries and developer tools - http://bytescout.com/buy/purchase_nfr_for_mvp.html
    YourKit: Java and .NET profiler - http://yourkit.com/.net/profiler/index.jsp
    Aspose: .NET components - http://www.aspose.com/corporate/community/2012_05_08_nfr-licenses-for-community-leaders.aspx

    Apart from Google/Bing fu, stackoverflow and breathtech were a great help in compiling the above list. If you know of any other benefits, offers or complimentary subscriptions on offer for MVPs not covered in the list above, please add to the comment thread and I'll have the list updated. Enjoy!

    Read the article

  • Should each page of a Blog listing have its own Title

    - by RandomBen
    Should example.com/Blog?Page=1, example.com/Blog?Page=2, etc. have the same title? I have done some research on this, and SEOmoz's tools say I have duplicate titles, and so do Google's Webmaster Tools. If you look at top-end examples like http://www.seobook.com/blog and http://www.seomoz.org/blog, they both use the same title across all of their ?Page=X URLs. So which is the better choice, or does it even matter?
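    Neither site shows how it builds its titles, but if you do decide to give each page a distinct title, the usual pattern is to append the page number on every page after the first. Here is a minimal, illustrative Java sketch; the class and method names are made up for this example, not taken from either blog:

    public class BlogTitleBuilder {
        // Gives each paginated listing page a distinct <title>,
        // e.g. "My Blog" for page 1 and "My Blog - Page 3" for page 3.
        static String titleFor(String blogTitle, int page) {
            return page <= 1 ? blogTitle : blogTitle + " - Page " + page;
        }

        public static void main(String[] args) {
            System.out.println(titleFor("My Blog", 1)); // My Blog
            System.out.println(titleFor("My Blog", 3)); // My Blog - Page 3
        }
    }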

    Read the article

  • SQL SERVER – 2012 – All Download Links in Single Page – SQL Server 2012

    - by pinaldave
    SQL Server 2012 RTM has just been announced, and recently I wrote about all the SQL Server 2012 certifications on a single page. As feedback, I received suggestions to have a single page where everything about SQL Server 2012 is listed. I will keep this page updated as new downloads are announced.

    Microsoft SQL Server 2012 Evaluation - Microsoft SQL Server 2012 enables a cloud-ready information platform that will help organizations unlock breakthrough insights across the organization.
    Microsoft SQL Server 2012 Express - a powerful and reliable free data management system that delivers a rich and reliable data store for lightweight web sites and desktop applications.
    Microsoft SQL Server 2012 Feature Pack - a collection of stand-alone packages which provide additional value for Microsoft SQL Server 2012.
    Microsoft SQL Server 2012 Report Builder - provides a productive report-authoring environment for IT professionals and power users. It supports the full capabilities of SQL Server 2012 Reporting Services.
    Microsoft SQL Server 2012 Master Data Services Add-in for Microsoft Excel - gives multiple users the ability to update master data in a familiar tool without compromising the data's integrity in Master Data Services.
    Microsoft SQL Server 2012 Performance Dashboard Reports - Reporting Services report files designed to be used with the Custom Reports feature of SQL Server Management Studio.
    Microsoft SQL Server 2012 PowerPivot for Microsoft Excel 2010 - provides ground-breaking technology: fast manipulation of large data sets, streamlined integration of data, and the ability to effortlessly share your analysis through Microsoft SharePoint.
    Microsoft SQL Server 2012 Reporting Services Add-in for Microsoft SharePoint Technologies 2010 - allows you to integrate your reporting environment with the collaborative SharePoint 2010 experience.
    Microsoft SQL Server 2012 Semantic Language Statistics - the Semantic Language Statistics Database is a required component for the Statistical Semantic Search feature in Microsoft SQL Server 2012.
    Microsoft® SQL Server 2012 FileStream Driver (Windows Logo Certification) - catalog file for the Microsoft SQL Server 2012 FileStream Driver that is certified for Windows Server 2008 R2. It meets Microsoft standards for compatibility and recommended practices with the Windows Server 2008 R2 operating systems.
    Microsoft SQL Server StreamInsight 2.0 - Microsoft StreamInsight is Microsoft's Complex Event Processing technology to help businesses create event-driven applications and derive better insights by correlating event streams from multiple sources with near-zero latency.
    Microsoft JDBC Driver 4.0 for SQL Server - a Type 4 JDBC driver that provides database connectivity through the standard JDBC application program interfaces (APIs) available in Java Platform, Enterprise Edition 5 and 6.
    Data Quality Services Performance Best Practices Guide - focuses on a set of best practices for optimizing the performance of Data Quality Services (DQS).
    Microsoft Drivers 3.0 for SQL Server for PHP - provide connectivity to Microsoft SQL Server from PHP applications.
    Product Documentation for Microsoft SQL Server 2012 for firewall and proxy restricted environments - the Microsoft SQL Server 2012 setup installs only the Help Viewer; it does not install any documentation. All of the SQL Server documentation is available online.

    Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology
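    As a side note, here is a minimal sketch of what using the JDBC driver looks like from Java. The server name, database and credentials are placeholders, not values from this post; you also need the driver jar (sqljdbc4.jar) on your classpath:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class JdbcSmokeTest {
        public static void main(String[] args) throws Exception {
            // Hypothetical connection string; replace host, database, user and password with your own.
            String url = "jdbc:sqlserver://localhost:1433;databaseName=master;user=sa;password=secret";
            try (Connection con = DriverManager.getConnection(url);
                 Statement st = con.createStatement();
                 ResultSet rs = st.executeQuery("SELECT @@VERSION")) {
                while (rs.next()) {
                    System.out.println(rs.getString(1)); // prints the SQL Server version string
                }
            }
        }
    }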

    Read the article

  • Installing ASP.NET MVC 2 RTM on Visual Studio 2010 RC

    - by shiju
    Visual Studio 2010 RC is built against the ASP.NET MVC 2 RC version, but you can easily install ASP.NET MVC 2 RTM on Visual Studio 2010 RC. To install ASP.NET MVC 2 RTM, do the following steps: 1) Uninstall "ASP.NET MVC 2". 2) Uninstall "Microsoft ASP.NET MVC 2 – Visual Studio 2008 Tools". 3) Install the new ASP.NET MVC 2 RTM version for Visual Studio 2008 SP1. The above steps will enable you to use the ASP.NET MVC 2 RTM version on Visual Studio 2010 RC. Note: Don't uninstall "Microsoft ASP.NET MVC 2 – Visual Studio 2010 Tools".

    Read the article

  • SQL Server Management Data Warehouse - quick tour on setting health monitoring policies

    - by ssqa.net
    Profiler, Perfmon, DMVs & scripts are legendary tools for a DBA to monitor the SQL arena. Alongside these tools, SQL Server 2008 adds a powerful pairing: the policy-based management (PBM) framework and the management data warehouse (MDW), a relational database that stores the data collected from each server that is a data collection target. This data is used to generate the reports for the System Data collection sets, and can also be used to create custom reports. .....(read more)

    Read the article

  • Understanding EDI 997

    - by VishnuTiwariBlog
    Hi guys, this one is for EDI starters. Below is the complete detail of the EDI 997 segments and elements.

    997 Functional Acknowledgment Transaction Layout:

    010 ST - Transaction Set Header (M). Indicates the start of a transaction set and assigns a control number. Example: ST*997*382823~
      ST01 (M) - Code uniquely identifying a Transaction Set.
      ST02 (M) - Identifying control number that must be unique within the transaction set functional group, assigned by the originator for a transaction set.
    020 AK1 - Functional Group Response Header (M). Starts acknowledgment of a functional group. Example: AK1*QM*2459823
      AK101 - Code identifying a group of application-related transaction sets: IN Invoice Information (810), SH Ship Notice/Manifest (856).
      AK102 - Assigned number originated and maintained by the sender.
    030 AK2 - Transaction Set Response Header (M). Starts acknowledgment of a single transaction set. Example: AK2*856*001
      AK201 (M) - Code uniquely identifying a Transaction Set: 810 Invoice, 856 Ship Notice/Manifest.
      AK202 (M) - Identifying control number that must be unique within the transaction set functional group, assigned by the originator for a transaction set.
    040 AK3 - Data Segment Note (O). Reports errors in a data segment and identifies the location of the data segment. Example: AK3*TD3*9
      AK301 Segment ID Code - Code defining the segment ID of the data segment in error (see Appendix A, Number 77).
      AK302 Segment Position in Transaction Set - The numerical count position of this data segment from the start of the transaction set; the transaction set header is count position 1.
    050 AK4 - Data Element Note (O). Reports errors in a data element or composite data structure and identifies the location of the data element. Example: AK4*2**2
      AK401 Position in Segment - Code indicating the relative position of a simple data element, or the relative position of a composite data structure combined with the relative position of the component data element within the composite data structure, in error; the count starts with 1 for the simple data element or composite data structure immediately following the segment ID.
      AK402 Element Position in Segment - Indicates the relative position of a simple data element, or the relative position of a composite data structure with the relative position of the component within the composite data structure, in error; in the data segment the count starts with 1 for the simple data element or composite data structure immediately following the segment ID.
      AK403 Data Element Syntax Error Code - Code indicating the error found after syntax edits of a data element: 1 Mandatory Data Element Missing, 2 Conditional Required Data Element Missing, 3 Too Many Data Elements, 4 Data Element Too Short, 5 Data Element Too Long, 6 Invalid Character in Data Element, 7 Invalid Code Value, 8 Invalid Date, 9 Invalid Time, 10 Exclusion Condition Violated.
      AK404 Copy of Bad Data Element - A copy of the data element in error.
    060 AK5 - Transaction Set Response Trailer (M). Acknowledges acceptance or rejection and reports errors in a transaction set. Examples: AK5*A~ or AK5*R*5~
      AK501 Transaction Set Acknowledgment Code - Code indicating accept or reject condition based on the syntax editing of the transaction set: A Accepted, E Accepted But Errors Were Noted, R Rejected.
      AK502 Transaction Set Syntax Error Code - Code indicating the error found based on the syntax editing of a transaction set: 1 Transaction Set Not Supported, 2 Transaction Set Trailer Missing, 3 Transaction Set Control Number in Header and Trailer Do Not Match, 4 Number of Included Segments Does Not Match Actual Count, 5 One or More Segments in Error, 6 Missing or Invalid Transaction Set Identifier, 7 Missing or Invalid Transaction Set Control Number.
    070 AK9 - Functional Group Response Trailer (M). Acknowledges acceptance or rejection of a functional group and reports the number of included transaction sets from the original trailer, the accepted sets, and the received sets in this functional group. Examples: AK9*A*1*1*1~ or AK9*R*1*1*0~
      AK901 Functional Group Acknowledge Code - Code indicating accept or reject condition based on the syntax editing of the functional group: A Accepted, E Accepted But Errors Were Noted, R Rejected.
      AK902 Number of Transaction Sets Included - Total number of transaction sets included in the functional group or interchange (transmission) group terminated by the trailer containing this data element.
      AK903 Number of Received Transaction Sets - Number of Transaction Sets received.
      AK904 Number of Accepted Transaction Sets - Number of accepted Transaction Sets in a Functional Group.
      AK905 Functional Group Syntax Error Code - Code indicating the error found based on the syntax editing of the functional group header and/or trailer: 1 Functional Group Not Supported, 2 Functional Group Version Not Supported, 3 Functional Group Trailer Missing, 4 Group Control Number in the Functional Group Header and Trailer Do Not Agree, 5 Number of Included Transaction Sets Does Not Match Actual Count, 6 Group Control Number Violates Syntax.
    080 SE - Transaction Set Trailer (M). Indicates the end of the transaction set and provides the count of the transmitted segments (including the beginning (ST) and ending (SE) segments). Example: SE*9*223~
      SE01 Number of Included Segments - Total number of segments included in a transaction set, including the ST and SE segments.
      SE02 Transaction Set Control Number - Identifying control number that must be unique within the transaction set functional group, assigned by the originator for a transaction set.
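    To make the layout concrete, here is a minimal, hypothetical Java sketch of reading an AK5 segment. Real EDI parsers take the element (*) and segment (~) separators from the ISA envelope; this sketch simply assumes them:

    public class Ak5Parser {
        // Parses a 997 AK5 segment such as "AK5*A" or "AK5*R*5".
        // Assumes '*' as the element separator and that the trailing '~'
        // segment terminator has already been stripped.
        public static String describe(String segment) {
            String[] elements = segment.split("\\*");
            if (!"AK5".equals(elements[0]) || elements.length < 2) {
                throw new IllegalArgumentException("Not a valid AK5 segment: " + segment);
            }
            switch (elements[1]) { // AK501: A, E, or R
                case "A": return "Transaction set accepted";
                case "E": return "Accepted, but errors were noted";
                case "R": return "Rejected, syntax error code "
                                 + (elements.length > 2 ? elements[2] : "n/a"); // AK502
                default:  return "Unknown acknowledgment code: " + elements[1];
            }
        }

        public static void main(String[] args) {
            System.out.println(describe("AK5*A"));   // Transaction set accepted
            System.out.println(describe("AK5*R*5")); // Rejected, syntax error code 5
        }
    }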

    Read the article

  • Web Platform Installer bundles for Visual Studio 2010 SP1 - and how you can build your own WebPI bundles

    - by Jon Galloway
    Visual Studio SP1 is now available via the Web Platform Installer, which means you've got three options:

    1. Download the 1.5 GB ISO image
    2. Run the 750KB Web Installer (which figures out what you need to download)
    3. Install via Web PI

    Note: I covered some tips for installing VS2010 SP1 last week - including some that apply to all of these, such as removing options you don't use prior to installing the service pack to decrease the installation time and download size.

    Two Visual Studio 2010 SP1 Web PI packages

    There are actually two Web PI packages for VS2010 SP1. There's the standard Visual Studio 2010 SP1 package [Web PI link], which includes (quoting ScottGu's post): VS2010 SP1, ASP.NET MVC 3 (runtime + tools support), IIS 7.5 Express, SQL Server Compact Edition 4.0 (runtime + tools support), and Web Deployment 2.0. The notes on that package sum it up pretty well: "Looking for the latest everything? Look no further. This will get you Visual Studio 2010 Service Pack 1 and the RTM releases of ASP.NET MVC 3, IIS 7.5 Express, SQL Server Compact 4.0 with tooling, and Web Deploy 2.0. It's the value meal of Microsoft products. Tell your friends!"

    Note: This bundle includes the Visual Studio 2010 SP1 web installer, which will dynamically determine the appropriate service pack components to download and install. This is typically in the range of 200-500 MB and will take 30-60 minutes to install, depending on your machine configuration.

    There is also a Visual Studio 2010 SP1 Core package [Web PI link], which includes only the SP without any of the other goodies (MVC 3, IIS Express, etc.). If you're doing any web development, I'd highly recommend the main package, since the other installs are small and simple, but if you're working in another space, you might want the core package.

    Installing via the Web Platform Installer

    I generally like to go with Web PI when possible, since it simplifies most software installations due to things like:

    - Smart dependency management: installing apps or tools which have software dependencies will automatically figure out which dependencies you don't have and add them to the list (which you can review before install)
    - Simultaneous download and install: if your install includes more than one package, it will automatically pull the dependencies first and begin installing them while downloading the others
    - Lists the latest downloads: no need to search around, as they're all listed based on a live feed
    - Includes open source applications: a lot of popular open source applications are included, as well as Microsoft software and tools
    - No worries about reinstallation: Web PI installations detect what you've got installed, so for instance if you've got MVC 3 installed you don't need to worry about the VS2010 SP1 package install messing anything up

    In addition to the links I included above, you can install Web PI from http://www.microsoft.com/web/downloads/platform.aspx, and if you have Web PI installed you can just tap the Windows key and type "Web Platform" to bring it up in the Start search list. You'll see Visual Studio SP1 listed in the spotlight list as shown below. That's the standard package, which includes MVC 3 / IIS 7.5 Express / SQL Compact / Web Deploy. If you just want the core install, you can use the search box in the upper right corner, typing in "Visual Studio SP1" as shown.

    Core Install: Use Web PI or the Visual Studio Web Installer?

    I think the big advantage of using Web PI to install VS 2010 SP1 is that it includes the other new bits. If you're going to install the SP1 core, I don't think there's as much advantage to using Web PI, as the Web PI core install just downloads the Visual Studio Web Installer anyway. I think Web PI makes it a little easier to find the download, but not a lot. The Visual Studio Web Installer checks dependencies, so there's no big advantage there. If you do happen to hit any problems installing Visual Studio SP1 via Web PI, I'd recommend running the Visual Studio Web Installer, then running the Web PI VS2010 SP1 package to get all the other goodies. I talked to one person who hit some random snag, recommended that, and it worked out.

    Custom Web Platform Installer bundles

    You can create links that will launch the Web Platform Installer with a custom list of tools. You can see an example of this by clicking through on the install button at http://asp.net/downloads (cancelling the installation dialog). You'll see this in the address bar:

    http://www.microsoft.com/web/gallery/install.aspx?appsxml=&appid=MVC3;ASPNET;NETFramework4;SQLExpress;VWD

    Notice that the appid querystring parameter includes a semicolon-delimited list, and you can make your own custom Web PI links with your own desired app list (a small helper for composing such links appears after the App ID list below). I can think of a lot of cases where that would be handy: linking to a recommended software configuration from a software project or product, setting up a recommended / documented / supported install list for a software development team or IT shop, etc. For instance, here's a link that installs just VS2010 SP1 Core and the SQL CE tools:

    http://www.microsoft.com/web/gallery/install.aspx?appsxml=&appid=VS2010SP1Core;SQLCETools

    Note: If you've already got all or some of the products installed, the display will reflect that. On my dev box, which has the full SP1 package, here's what the above link gives me:

    Here's another example: on a fresh box I created a link to install MVC 3 and the Web Farm Framework (http://www.microsoft.com/web/gallery/install.aspx?appsxml=&appid=MVC3;WebFarmFramework) and got the following items added to the cart:

    But where do I get the App IDs?

    Aha, that's the trick. You can link to a list of cool packages, but you need to know the App IDs to link to them. To figure that out, I turned on tracing in the Web Platform Installer (also handy if you're ever having trouble with a Web PI install) and from the trace logs saw that the list of packages is pulled from an XML file:

    DownloadManager Information: 0 : Loading product xml from: https://go.microsoft.com/?linkid=9763242
    DownloadManager Verbose: 0 : Connecting to https://go.microsoft.com/?linkid=9763242 with (partial) headers:
    Referer: wpi://2.1.0.0/Microsoft Windows NT 6.1.7601 Service Pack 1
    If-Modified-Since: Wed, 09 Mar 2011 14:15:27 GMT
    User-Agent: Platform-Installer/3.0.3.0 (Microsoft Windows NT 6.1.7601 Service Pack 1)
    DownloadManager Information: 0 : https://go.microsoft.com/?linkid=9763242 responded with 302
    DownloadManager Information: 0 : Response headers:
    HTTP/1.1 302 Found
    Cache-Control: private
    Content-Length: 175
    Content-Type: text/html; charset=utf-8
    Expires: Wed, 09 Mar 2011 22:52:28 GMT
    Location: https://www.microsoft.com/web/webpi/3.0/webproductlist.xml
    Server: Microsoft-IIS/7.5
    X-AspNet-Version: 2.0.50727
    X-Powered-By: ASP.NET
    Date: Wed, 09 Mar 2011 22:53:27 GMT

    Browsing to https://www.microsoft.com/web/webpi/3.0/webproductlist.xml shows the full list. You can search through that in your browser / text editor if you'd like, open it in Excel as an XML table, etc. Here's a list of the App IDs as of today:

    SMO SMO32 PHP52ForIISExpress PHP53ForIISExpress StaticContent DefaultDocument DirectoryBrowse HTTPErrors HTTPRedirection ASPNET NETExtensibility ASP CGI ISAPIExtensions ISAPIFilters ServerSideIncludes HTTPLogging LoggingTools RequestMonitor Tracing CustomLogging ODBCLogging BasicAuthentication WindowsAuthentication DigestAuthentication ClientCertificateMappingAuthentication IISClientCertificateMappingAuthentication URLAuthorization RequestFiltering IPSecurity StaticContentCompression DynamicContentCompression IISManagementConsole IISManagementScriptsAndTools ManagementService MetabaseAndIIS6Compatibility WASProcessModel WASNetFxEnvironment WASConfigurationAPI IIS6WPICompatibility IIS6ScriptingTools IIS6ManagementConsole LegacyFTPServer FTPServer WebDAV LegacyFTPManagementConsole FTPExtensibility AdminPack AdvancedLogging WebFarmFrameworkNonLoc ExternalCacheNonLoc WebFarmFramework WebFarmFrameworkv2 WebFarmFrameworkv2_beta ExternalCache ECacheUpdate ARRv1 ARRv2Beta1 ARRv2Beta2 ARRv2RC ARRv2NonLoc ARRv2 ARRv2Update MVC MVCBeta MVCRC1 MVCRC2 DBManager DbManagerUpdate DynamicIPRestrictions DynamicIPRestrictionsUpdate DynamicIPRestrictionsLegacy DynamicIPRestrictionsBeta2 FTPOOB IISPowershellSnapin RemoteManager SEOToolkit VS2008RTM MySQL SQLDriverPHP52IIS SQLDriverPHP53IIS SQLDriverPHP52IISExpress SQLDriverPHP53IISExpress SQLExpress SQLManagementStudio SQLExpressAdv SQLExpressTools UrlRewrite UrlRewrite2 UrlRewrite2NonLoc UrlRewrite2RC UrlRewrite2Beta UrlRewrite10 UrlScan MVC3Installer MVC3 MVC3LocInstaller MVC3Loc MVC2 VWD VWD2010SP1Pack NETFramework4 WebMatrix WebMatrix_v1Refresh IISExpress IISExpress_v1 IIS7 AspWebPagesVS AspWebPagesVS_1_0 Plan9 Plan9Loc WebMatrix_WHP SQLCE SQLCETools SQLCEVSTools SQLCEVSTools_4_0 SQLCEVSToolsInstaller_4_0 SQLCEVSToolsInstallerNew_4_0 SQLCEVSToolsInstallerRepair_EN_4_0 SQLCEVSToolsInstallerRepair_JA_4_0 SQLCEVSToolsInstallerRepair_FR_4_0 SQLCEVSToolsInstallerRepair_DE_4_0 SQLCEVSToolsInstallerRepair_ES_4_0 SQLCEVSToolsInstallerRepair_IT_4_0 SQLCEVSToolsInstallerRepair_RU_4_0 SQLCEVSToolsInstallerRepair_KO_4_0 SQLCEVSToolsInstallerRepair_ZH_CN_4_0 SQLCEVSToolsInstallerRepair_ZH_TW_4_0 VWD2008 WebDAVOOB WDeploy WDeploy_v2 WDeployNoSMO WDeploy11 WinCache52 WinCache53 NETFramework35 WindowsImagingComponent VC9Redist NETFramework20SP2 WindowsInstaller31 PowerShell PowerShellMsu PowerShell2 WindowsInstaller45 FastCGIUpdate FastCGIBackport FastCGIIIS6 IIS51 IIS60 SQLNativeClient SQLNativeClient2008 SQLNativeClient2005 SQLCLRTypes SQLCLRTypes32 SMO_10_1 MySQLConnector PHP52 PHP53 PHPManager VSVWD2010Feature VWD2010WebFeature_0 VWD2010WebFeature_1 VWD2010WebFeature_2 VS2010SP1Prerequisite RIAServicesToolkitMay2010 Silverlight4Toolkit Silverlight4Tools VSLS SSMAMySQL WebsitePanel VS2010SP1Core VS2010SP1Installer VS2010SP1Pack MissingVWDOrVSVWD2010Feature VB2010Beta2Express VCS2010Beta2Express VC2010Beta2Express RIAServicesToolkitApr2010 VS2010Beta1 VS2010RC VS2010Beta2 VS2010Beta2Express VS2k8RTM VSCPP2k8RTM VSVB2k8RTM VSCS2k8RTM VSVWDFeature LegacyWinCache SQLExpress2005 SSMS2005
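    If you find yourself composing these links often, a tiny helper keeps the format straight. This is just an illustrative Java sketch built on the URL pattern quoted above; the chosen App IDs are an example:

    public class WebPiLinkBuilder {
        // Builds a custom Web Platform Installer link from a list of App IDs,
        // using the install.aspx URL pattern shown earlier in this post.
        public static String buildLink(String... appIds) {
            return "http://www.microsoft.com/web/gallery/install.aspx?appsxml=&appid="
                    + String.join(";", appIds);
        }

        public static void main(String[] args) {
            System.out.println(buildLink("VS2010SP1Core", "SQLCETools"));
            // -> http://www.microsoft.com/web/gallery/install.aspx?appsxml=&appid=VS2010SP1Core;SQLCETools
        }
    }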

    Read the article

  • CRMIT’s HIGH VALUE CRM++ PLUGINS FOR CRM On DEMAND

    - by Soumo Das
    Customer satisfaction and experience being the two most considerable factors, businesses these days are on the lookout for automation tools that are world class, agile and keep quality at their core. CRMIT has developed such tools using cutting-edge technologies, abstracting industry best practices and R&D.

    Self Service Portal

    With customers being so meticulous about regular updates and reliable access to their data, administrators cannot afford to walk a thin line. Surviving without a resource that keeps track of customer requirements for services available 24x7 can severely affect productivity. In such a scenario, CRMIT's Self Service Portal (SSP) is the best solution. This not only tracks the required customer data, but also allows companies to stay in tune with their employees, vendors and stakeholders. One can directly sign up to become a CRMOD contact and SSP user. One need not use the database, as operations and interactions are handled at run time. This is a fully configurable solution that tracks results periodically, making it easy for end users. It also offers better security and data visibility that enables users to progress smoothly.

    Quote and Order Management

    When dealing with quotes, contracts and orders becomes complicated, Quote & Order Management works as a one-stop solution. CRMIT offers this tool for managing all this information and for taking care of customer orders and service requirements. This CRM On Demand plug-in allows one to create a new quote or copy an existing one. Products can be added directly from the product list of CRMOD, and the pricing is calculated automatically. A quote can be generated and mailed to external users in PDF, HTML and XLS formats. This not only allows quotes to be managed in an enhanced manner, but also supports various billing and tax calculation features that make work effortless.

    Report Scheduler

    When it comes to analyzing and providing statistics on the various business processes currently running in an organization, one cannot depend on manual updates, which may be inaccurate or delayed. CRMIT provides a SaaS-based solution, Report Scheduler, that allows CRM users to schedule reports at chosen frequencies and then receive them as email attachments at the scheduled time. With this tool, administrators control the report scheduler by assigning specific reports to specific users. After that, users can log in and schedule any assigned report for viewing at particular intervals on a monthly, weekly or daily basis. Additionally, users can copy the mail to external users and choose the preferred format. The best part is that sharing business data with third parties becomes easy, and users need not log into their CRMOD account to view reports.

    CRM On Demand Offline Solution

    CRM On Demand Offline is another CRM++ extension that allows one to work in both online and offline modes, and synchronizing the two is straightforward. CRM OD Offline works as an automation tool that improves efficiency; it is readily available as a Windows application installer, and users need to be online only while validating and synchronizing. Working in the offline mode also serves as a backup in most cases.

    Read the article

  • Inside Red Gate - Divisions

    - by Simon Cooper
    When I joined Red Gate back in 2007, there were around 80 people in the company. Now, around 3 years later, it's grown to more than 200. It's a constant battle against Dunbar's number - the maximum number of people you can keep track of in a social group - to try and maintain that 'small company' feel that attracted myself and so many others to apply in the first place. There are several strategies the company's developed over the years to try and mitigate the effects of Dunbar's number. One of the main ones has been divisionalisation.

    Divisions

    The first division, .NET, appeared around the same time that I started in 2007. This combined the development, sales, marketing and management of the .NET tools (then, ANTS Profiler v3) into a separate section of the office. The idea was to increase the cohesion and communication between the different people involved in the entire lifecycle of the tools: from initial product development, through to marketing, then to customer support, who would feed back to the development team. This was such a success that the other development teams were re-worked around this model in 2009. Nowadays there are 4 divisions - SQL Tools, DBA, .NET, and New Business. Along the way there have been various tweaks to the details - the sales teams have been merged into the divisions, and marketing and product support have been (mostly) centralised - but the same basic model remains.

    So, how has this helped? As Red Gate has continued to grow over the years, divisionalisation has turned Red Gate from a monolithic software company into what one person described as a 'federation of small businesses'. Each division is free to structure itself as it sees fit; it's free to decide what to concentrate development work on, organise its own newsletters and webinars, and decide its own release schedule. Each division is its own small business. In terms of numbers, the size of each division varies from 20 people (.NET) to 52 (SQL Tools), well below Dunbar's number. From a developer's perspective, this means the organisational structure is very flat and wide - there are only 2 layers between myself and the CEOs (not that it matters much; everyone can go and have a chat with Neil or Simon, or anyone else in between, whenever they want. Provided you can catch them at their desk!). As Red Gate grows and expands into new areas, new divisions will be created as needed, and old ones merged or disbanded, but the division structure will help to maintain that small-company feel that keeps Red Gate working as it does.

    Read the article

  • Getting UPK data into Excel

    - by maria.cozzolino(at)oracle.com
    Did you ever want someone to review your UPK outline outside of the Developer? You can send your outline to an Excel report, which can be distributed through email. Depending on how much additional data you want with your outline, there are two ways you can do this task.

    Basic data: you can print a listing of all the items in the outline.

    - With your outline open, choose File/Print...
    - Choose the "Save document as" command on the right, and choose Excel (or xlsx).
    - HINT: If you have not expanded your entire outline, it's faster to use the commands in Developer to expand the entire outline. However, you can expand specific sections by clicking on them in the print preview.
    - NOTE: If you have the Details view displayed rather than the Player view, you can print all the data that appears in that view.

    Advanced data: if you desire a more detailed report, you can use the HP Quality Center publishing style, which also creates an Excel file. This style contains a default set of fields for use with Quality Center, but any of the metadata fields can be added to the report, and it can be used for more than just importing into HP Quality Center. To add additional columns to the HP Quality Center publishing style:

    1. Make a copy of the publishing style. This ensures that you have a good copy to revert to if something goes wrong with your customizations, and also allows you to keep your modifications when the software is upgraded.
    2. Open the copy of the columnspec.xml file in your favorite XML editor - I use Notepad. (This file is located in a language-specific folder in the HP Quality Center publishing style.)
    3. Scroll down the columnspec file until you find the column to include. All the metadata fields that can be added to the report are listed in the columnspec file - you just need to tell the system to include the columns.
    4. You will see a series of column sections, each carrying an export flag and a column-header value.
    5. Change the value for "col export" to "yes". This will include the column in the Excel file.
    6. If desired, change the value for "Play_ModesColHeader" to be whatever name you wish to appear in the Excel column heading.
    7. Save the columnspec file.
    8. Save the publishing style package.

    Now, when you publish for HP Quality Center, you will see your newly added columns. You can refer to the section on Customizing HP Quality Center Output in the Content Deployment Guide for additional customization details. Happy customization! I'd be interested in hearing what other uses you have for Excel reporting. Wishing you and yours a happy and healthy New Year! ~~Maria Cozzolino, Manager of Software Requirements and UI
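    If you want to flip many columns at once, a short script can do it. The sketch below is hypothetical: it assumes each column appears in columnspec.xml as a <col> element with an export attribute, which is an inference from steps 5 and 6 above rather than a documented schema, so verify against your own columnspec.xml before using it.

    import java.io.File;
    import javax.xml.parsers.DocumentBuilderFactory;
    import javax.xml.transform.TransformerFactory;
    import javax.xml.transform.dom.DOMSource;
    import javax.xml.transform.stream.StreamResult;
    import org.w3c.dom.Document;
    import org.w3c.dom.Element;
    import org.w3c.dom.NodeList;

    public class ColumnSpecToggler {
        public static void main(String[] args) throws Exception {
            // Assumed layout: <col export="no" ...> entries; adjust the element
            // and attribute names to match your actual columnspec.xml.
            File spec = new File("columnspec.xml");
            Document doc = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder().parse(spec);
            NodeList cols = doc.getElementsByTagName("col");
            for (int i = 0; i < cols.getLength(); i++) {
                Element col = (Element) cols.item(i);
                col.setAttribute("export", "yes"); // include every column in the Excel output
            }
            // Write the modified document back to the same file.
            TransformerFactory.newInstance().newTransformer()
                    .transform(new DOMSource(doc), new StreamResult(spec));
        }
    }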

    Read the article

< Previous Page | 155 156 157 158 159 160 161 162 163 164 165 166  | Next Page >