Search Results

Search found 45502 results on 1821 pages for 'open source contribution'.


  • SINGLE SIGN ON SECURITY THREAT! FACEBOOK access_token broadcast in the open/clear

    - by MOKANA
    Subsequent to my posting there was a remark made that this was not really a question, but I thought I did indeed postulate one. So that there is no ambiguity, here is the question with a lead-in: since there is no data sent from Facebook during the Canvas load process that is not at some point divulged, including the access_token, session and other data that could uniquely identify a user, does anyone see any way, other than adding one more layer (i.e., a password sent over the wire via HTTPS along with the access_token), to ensure unique, untampered-with security for the user?

    Using Wireshark I captured the local broadcast while loading my Canvas application page. I was hugely surprised to see the access_token broadcast in the open, viewable for anyone to see. This access_token is appended to any HTTPS call to the Facebook Open Graph API. Using Facebook as a single-click log-on has now raised huge concerns for me. The token is stored in a session object in memory, and the cookie is cleared upon app termination; after reviewing the FB.init calls I saw a lot of HTTPS calls, so I assumed the access_token was always encrypted. But last night I saw in the status bar a call that was plain HTTP and included the App ID, so I felt I should sniff the application Canvas load sequence. Today I did sniff the broadcast, and in the attached image you can see that there are HTTP calls with the access_token being broadcast in the open and clear, for anyone to gain access to.

    Am I missing something? Is what I am seeing, and my interpretation of it, really correct? If anyone can sniff and get the access_token, they can theoretically make calls to the Graph API via HTTPS, even though the callback would still need to be the site established in Facebook's application setup. But what is truly a security threat is anyone using the access_token for access to their own site. I do not see the value of a single sign-on via Facebook if the only thing that was established as secure was the access_token, because from what I can see it clearly is not secure. Access tokens that never expire do not change. Access_tokens are different for every user, so access to another site could be held tight to just a single user, but compromising even a single user's data is unacceptable. http://www.creatingstory.com/images/InTheOpen.png

    Went back and did more research on this. FINDINGS: I re-ran the Canvas application to verify that it was not any of my own code doing the broadcasting. In this call: HTTP GET /connect.php/en_US/js/CacheData HTTP/1.1, the USER ID is clearly visible in the cookie. So USER IDs are fully visible, but they are already: anyone can go to pretty much anyone's page, hover over the image, and see the USER ID. So no big threat. APP_IDs are also easily obtainable, but... http://www.creatingstory.com/images/InTheOpen2.png The above file clearly shows the FULL ACCESS TOKEN in the OPEN, via a Facebook-initiated call. Am I wrong? TELL ME I AM WRONG, because I want to be wrong about this. I have since reset my app secret, so I am showing the real sniff of the Canvas page being loaded.

    Additional data 02/20/2011: @ifaour - I appreciate the time you took to compile your response. I am pretty familiar with the OAuth process and have a pretty solid understanding of the signed_request unpacking and utilization of the access_token. I perform a substantial amount of my processing on the server, and my Facebook server-side flows are all complete and function without any flaw that I know of. The application secret is secure, never passed to the front-end application, and changed regularly. I am being as fanatical about security as I can be, knowing there is so much I don't know that could come back and bite me.

    Two huge access_token issues. The issues concern the possible utilization of the access_token from the USER AGENT (browser). During the FB.init() process of the Facebook JavaScript SDK, a cookie is created, as well as an object in memory called a session object. This object, along with the cookie, contains the access_token, session, a secret, the uid, and the status of the connection. The session object is structured such that it supports both the new OAuth and the legacy flows. With OAuth, the access_token and status are pretty much all that is used in the session object.

    The first issue is that the access_token is used to make HTTPS calls to the Graph API. If you had the access_token, you could do this from any browser: https://graph.facebook.com/220439?access_token=... and it will return a ton of information about the user. So anyone with the access token can gain access to a Facebook account. You can also make additional calls to any info the user has granted the application tied to the access_token. At first I thought that a call into the Graph had to have a callback to the URL established in the app setup, but I tested it as mentioned below and it returns info right back into the browser. Adding that callback requirement would be a good idea, I think; it would tighten things up a bit.

    The second issue is utilization of some unique, private, secured data that identifies the user to the third-party database. In my case, I would use a single sign-on to populate user information into my database using this unique secured data item (i.e., the access_token, which contains the APP ID, the USER ID, and a sequence hashed with the secret). None of this is a problem on the server side: you get a signed_request, you unpack it with the secret, make HTTPS calls, and get HTTPS responses back. When a user has information entered via the USER AGENT (browser) that must be stored via a POST, this unique secured data element would be sent via HTTPS so that it is validated prior to database insertion. However, if there is NO secured piece of unique data supplied via the single sign-on process, then there is no way to guarantee against unauthorized access. The access_token is the one piece of data that is utilized by Facebook to make the HTTPS calls into the Graph API. It is considered unique with regard to BOTH the USER and the APPLICATION, and is initially secured via the signed_request packaging. If, however, it is subsequently transmitted in the clear, and I can sniff the wire and obtain the access_token, then I can pretend to be the application and gain the information the user has authorized the application to see. I tried the above example from Safari and IE browsers and each returned all of my information to me right in the browser.

    In conclusion, the access_token is part of the signed_request, and that is how the application initially obtains it. After OAuth authentication and authorization (i.e., the USER has logged into Facebook and then runs your app), the access_token is stored as mentioned above, and I have sniffed it such that I see it stored in a cookie that is transmitted over the wire. The result is that there is NO UNIQUE, SECURED, IDENTIFIABLE piece of information that can be used to support interaction with the database. In other words, unless there were one more piece of secure data sent along with the access_token to my database (i.e., a password), I would not be able to discern whether a call is legitimate. Luckily I utilize secure AJAX via POST and the call has to come from the same domain, but I am sure there is a way to hijack that.

    I am totally open to any ideas on how to uniquely identify my USERS other than adding another layer (a password) to this single sign-on process, or to someone showing me that I read and analyzed my data incorrectly and that the access_token is in fact always secure over the wire. Mahalo nui loa in advance.
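    As a concrete illustration of the exposure described above (a sketch, assuming Python 3; the user id 220439 is the one quoted earlier, and the token value is a stand-in, not a real credential):

        # Sketch of the replay risk: a token sniffed off cleartext HTTP can be
        # presented to the Graph API from any machine; no app callback is involved.
        import urllib.request

        sniffed_token = "AAAC..."  # stand-in for a token captured off the wire
        url = "https://graph.facebook.com/220439?access_token=" + sniffed_token
        with urllib.request.urlopen(url) as resp:
            # Prints the profile data the user granted to the application.
            print(resp.read().decode())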


  • Join using combined conditions on one join table

    - by Nathan Wienert
    I have a join table joining songs to genres. The table has a 'source' column that's used to identify where the genre was found. Genres are found from blogs, artists, tags, and posts. So:

        songs    song_genre                   genres
        id       song_id, source, genre_id    id

    What I want to build is a song SELECT query that works something like this, given I already have a genre_id:

        IF exists song_genre with source='artist' AND a song_genre with source='blog'
        OR exists song_genre with source='artist' AND a song_genre with source='post'
        OR exists song_genre with source='tag'

    I was going to do it with a bunch of joins, but I'm sure I'm not doing it very well. Using Postgres 9.1.
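    A hedged sketch of one way to express this with EXISTS subqueries rather than joins (table and column names follow the question; :genre_id is a placeholder for the genre id already in hand):

        -- Songs matched for :genre_id either by (artist AND (blog OR post)) or by tag;
        -- the parentheses mirror the three-way condition in the question.
        SELECT s.*
        FROM songs s
        WHERE (EXISTS (SELECT 1 FROM song_genre sg
                       WHERE sg.song_id = s.id
                         AND sg.genre_id = :genre_id
                         AND sg.source = 'artist')
               AND EXISTS (SELECT 1 FROM song_genre sg
                           WHERE sg.song_id = s.id
                             AND sg.genre_id = :genre_id
                             AND sg.source IN ('blog', 'post')))
           OR EXISTS (SELECT 1 FROM song_genre sg
                      WHERE sg.song_id = s.id
                        AND sg.genre_id = :genre_id
                        AND sg.source = 'tag');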


  • Is the guideline: don't open email attachments or execute downloads or run plug-ins (Flash, Java) from untrusted sites enough to avert infection?

    - by therobyouknow
    I'd like to know if the following is enough to avert malware, as I feel that the press and other advisory resources aren't always precisely clear on all the methods by which PCs get infected. To my mind, the key step to getting infected is a conscious choice by the user to run an executable attachment from an email or download, but also viewing content that requires a plug-in (Flash, Java or something else). This conscious step breaks down into the following possibilities:

    Don't open email attachments: I certainly agree with this. But let's try to be clear: email comes in two parts, the text and the attachment. Just reading the email should not be risky, right? But opening (i.e. running) email attachments IS risky (malware can be present in the attachment).

    Don't execute downloads (e.g. from sites linked in suspect emails or otherwise): again, I certainly agree with this (malware can be present in the executable). Usually the user has to voluntarily click to download, or at least click to run the executable. Question: has there ever been a case where a user has visited a site and a download has completed and run entirely on its own?

    Don't run content requiring plug-ins: certainly agree: malware can be present in the executable. I vaguely recall cases with Flash, but know of the Java-based vulnerabilities much better.

    Now, is the above enough? Note that I'm much more cautious than this. What I'm concerned about is that the media is not always very clear about how the malware infection occurs. They talk of "booby-trapped sites" and "browser attacks" - HOW exactly? I'd presume the other threat would be malevolent use of JavaScript to make an executable run on the user's machine. Would I be right, and are there details I can read up on about this? Generally I like JavaScript as a developer, please note. An accepted answer would fill in any holes I've missed here, so we have a complete general view of what the threats are (even though the actual specific details of new threats vary, the general vectors are known).


  • GLSL Error: failed to preprocess the source. How can I troubleshoot this?

    - by Brent Parker
    I'm trying to learn to play with OpenGL GLSL shaders. I've written a very simple program to simply create a shader and compile it. However, whenever I get to the compile step, I get the error:

        Error: Preprocessor error
        Error: failed to preprocess the source.

    Here's my very simple code:

        #include <GL/gl.h>
        #include <GL/glu.h>
        #include <GL/glut.h>
        #include <GL/glext.h>
        #include <time.h>
        #include <stdio.h>
        #include <iostream>
        #include <stdlib.h>

        using namespace std;

        const int screenWidth = 640;
        const int screenHeight = 480;

        const GLchar* gravity_shader[] = {
            "#version 140"
            "uniform float t;"
            "uniform mat4 MVP;"
            "in vec4 pos;"
            "in vec4 vel;"
            "const vec4 g = vec4(0.0, 0.0, -9.80, 0.0);"
            "void main() {"
            "    vec4 position = pos;"
            "    position += t*vel + t*t*g;"
            "    gl_Position = MVP * position;"
            "}"
        };

        double pointX = (double)screenWidth/2.0;
        double pointY = (double)screenWidth/2.0;

        void initShader() {
            GLuint shader = glCreateShader(GL_VERTEX_SHADER);
            glShaderSource(shader, 1, gravity_shader, NULL);
            glCompileShader(shader);
            GLint compiled = true;
            glGetShaderiv(shader, GL_COMPILE_STATUS, &compiled);
            if(!compiled) {
                GLint length;
                GLchar* log;
                glGetShaderiv(shader, GL_INFO_LOG_LENGTH, &length);
                log = (GLchar*)malloc(length);
                glGetShaderInfoLog(shader, length, &length, log);
                std::cout << log << std::endl;
                free(log);
            }
            exit(0);
        }

        bool myInit() {
            initShader();
            glClearColor(1.0f, 1.0f, 1.0f, 0.0f);
            glColor3f(0.0f, 0.0f, 0.0f);
            glPointSize(1.0);
            glLineWidth(1.0f);
            glMatrixMode(GL_PROJECTION);
            glLoadIdentity();
            gluOrtho2D(0.0, (GLdouble) screenWidth, 0.0, (GLdouble) screenHeight);
            glEnable(GL_DEPTH_TEST);
            return true;
        }

        int main(int argc, char** argv) {
            glutInit(&argc, argv);
            glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB);
            glutInitWindowSize(screenWidth, screenHeight);
            glutInitWindowPosition(100, 150);
            glutCreateWindow("Mouse Interaction Display");
            myInit();
            glutMainLoop();
            return 0;
        }

    Where am I going wrong? If it helps, I am trying to do this on an Acer Aspire One with an Atom processor and integrated Intel video, running the latest Ubuntu. It's not very powerful, but then again, this is a very simple shader. Thanks a lot for taking a look!
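    One plausible cause, offered as an assumption rather than a confirmed diagnosis for this driver: adjacent C string literals concatenate with no separator, so the array above reaches the GLSL preprocessor as a single line that begins "#version 140uniform float t;". A sketch of the same array with explicit newlines, so each GLSL line (especially the #version directive) ends where intended:

        // Each literal is newline-terminated; the concatenated source then has
        // one GLSL statement per line and a well-formed #version directive.
        const GLchar* gravity_shader[] = {
            "#version 140\n"
            "uniform float t;\n"
            "uniform mat4 MVP;\n"
            "in vec4 pos;\n"
            "in vec4 vel;\n"
            "const vec4 g = vec4(0.0, 0.0, -9.80, 0.0);\n"
            "void main() {\n"
            "    vec4 position = pos;\n"
            "    position += t*vel + t*t*g;\n"
            "    gl_Position = MVP * position;\n"
            "}\n"
        };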


  • Observations in Migrating from JavaFX Script to JavaFX 2.0

    - by user12608080
    Observations in Migrating from JavaFX Script to JavaFX 2.0

    Introduction

    Having been available for a few years now, there is a decent body of work written for JavaFX using the JavaFX Script language. With the general availability announcement of JavaFX 2.0 Beta, the natural question arises about converting the legacy code over to the new JavaFX 2.0 platform. This article reflects on some of the observations encountered while porting source code over from JavaFX Script to the new JavaFX API paradigm.

    The Application

    The program chosen for migration is an implementation of the Sudoku game and serves as a reference application for the book JavaFX – Developing Rich Internet Applications. The design of the program can be divided into two major components: (1) a user interface (ideally suited for JavaFX design) and (2) the puzzle generator. For the context of this article, our primary interest lies in the user interface. The puzzle generator code was lifted from a sourceforge.net project and is written entirely in Java. Regardless of which version of the UI we choose (JavaFX Script vs. JavaFX 2.0), no code changes were required for the puzzle generator code.

    The original user interface for the JavaFX Sudoku application was written exclusively in JavaFX Script, and as such is a suitable candidate to convert over to the new JavaFX 2.0 model. However, a few notable points are worth mentioning about this program. First off, it was written in the JavaFX 1.1 timeframe, when certain capabilities of the JavaFX framework were as of yet unavailable. Citing two examples: this program creates many of its own UI controls from scratch, because the built-in controls were yet to be introduced, and layout of graphical nodes is done in a very manual manner, again because much of the automatic layout capability was in flux at the time. It is worth considering that this program was written at a time when most of us were just coming up to speed on this technology. One would think that, given the opportunity to recreate this application anew, it would look a lot different from the current version.

    Comparing the Size of the Source Code

    An attempt was made to convert each of the original UI JavaFX Script source files (suffixed with .fx) over to a Java counterpart. Due to language feature differences, there are a small number of source files which only exist in one version or the other. The table below summarizes the size of each of the source files.

        JavaFX Script source file   Lines   Characters   JavaFX 2.0 Java source file      Lines   Characters
        (none)                      -       -            ArrowKey.java                    6       72
        Board.fx                    221     6831         Board.java                       205     6508
        BoardNode.fx                446     16054        BoardNode.java                   723     29356
        ChooseNumberNode.fx         168     5267         ChooseNumberNode.java            302     10235
        CloseButtonNode.fx          115     3408         CloseButton.java                 99      2883
        (none)                      -       -            ParentWithKeyTraversal.java      111     3276
        (none)                      -       -            FunctionPtr.java                 6       80
        (none)                      -       -            Globals.java                     20      554
        Grouping.fx                 8       140          (none)                           -       -
        HowToPlayNode.fx            121     3632         HowToPlayNode.java               136     4849
        IconButtonNode.fx           196     5748         IconButtonNode.java              183     5865
        Main.fx                     98      3466         Main.java                        64      2118
        SliderNode.fx               288     10349        SliderNode.java                  350     13048
        Space.fx                    78      1696         Space.java                       106     2095
        SpaceNode.fx                227     6703         SpaceNode.java                   220     6861
        TraversalHelper.fx          111     3095         (none)                           -       -
        Total                       2,077   79,127       Total                            2,531   87,800

    A few notes about this table are in order: the number of lines in each file was determined by running the Unix 'wc -l' command over each file, and the number of characters by running the Unix 'ls -l' command over each file. The examination of the code could certainly be much more rigorous. No standard formatting was performed on these files; all comments, however, were deleted.

    There was a certain expectation that the new Java version would require more lines of code than the original JavaFX Script version. As evidenced by a count of the total number of lines, the Java version has about 22% more lines than its FX Script counterpart. Furthermore, there was an additional expectation that the Java version would be more verbose in terms of the total number of characters. In fact, the preceding data shows that on average the Java source files contain fewer characters per line than the FX files. But that's not the whole story. Upon further examination, the FX Script source files had a disproportionate number of blank characters. Why? Because of the nature of how one develops JavaFX Script code. The object literal dominates FX Script code. It's not uncommon to see object literals indented halfway across the page, consuming lots of meaningless space characters.

    RAM consumption

    Not the most scientific analysis, but memory usage for the application was examined on a Windows Vista system by running the Windows Task Manager and viewing how much memory was being consumed by the Sudoku version in question. Roughly speaking, the FX Script version, after startup, had a RAM footprint of about 90MB and remained pretty much the same size. The Java version started out at about 55MB and maintained that size throughout its execution.

    What About Binding?

    Arguably, the most striking observation about the conversion from JavaFX Script to JavaFX 2.0 concerned the need for data synchronization, or lack thereof. In JavaFX Script, the primary means to synchronize data is via the bind expression (using the "bind" keyword), and perhaps to a lesser extent its "on replace" cousin. The bind keyword does not exist in Java, so for JavaFX 2.0 a Data Binding API has been introduced as a replacement.

    To give a feel for the difference between the two versions of the Sudoku program, the table that follows indicates how many binds were required for each source file. For JavaFX Script files, this was ascertained by simply counting the number of occurrences of the bind keyword. As can be seen, binding had been used frequently in the JavaFX Script version (and this does not take into consideration an additional half dozen or so "on replace" triggers). The JavaFX 2.0 program achieves the same functionality as the original JavaFX Script version, yet the equivalent of binding was only needed twice throughout the Java version of the source code.

        JavaFX Script source file   Binds   JavaFX 2.0 Java source file       "Binds"
        (none)                      -       ArrowKey.java                     0
        Board.fx                    1       Board.java                        0
        BoardNode.fx                7       BoardNode.java                    0
        ChooseNumberNode.fx         11      ChooseNumberNode.java             0
        CloseButtonNode.fx          6       CloseButton.java                  0
        (none)                      -       CustomNodeWithKeyTraversal.java   0
        (none)                      -       FunctionPtr.java                  0
        (none)                      -       Globals.java                      0
        Grouping.fx                 0       (none)                            -
        HowToPlayNode.fx            7       HowToPlayNode.java                0
        IconButtonNode.fx           9       IconButtonNode.java               0
        Main.fx                     1       Main.java                         0
        Main_Mobile.fx              1       (none)                            -
        SliderNode.fx               6       SliderNode.java                   1
        Space.fx                    0       Space.java                        0
        SpaceNode.fx                9       SpaceNode.java                    1
        TraversalHelper.fx          0       (none)                            -
        Total                       58      Total                             2

    Conclusions

    As the JavaFX 2.0 technology is so new, and experience with it equally so, it is possible and indeed probable that some of the observations noted in the preceding article may not apply across other attempts at migrating applications. That being said, this first experience indicates that the migrated Java code will likely be larger than the original JavaFX Script source, though not dramatically so. Furthermore, it appears that the requirements for data synchronization via binding, although very important, may be significantly reduced with the new platform.
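    To give the binding comparison a concrete shape, a small sketch (illustrative only, not code from the Sudoku sources) of the JavaFX 2.0 Data Binding API standing in for a JavaFX Script bind expression:

        // "total" tracks "score" automatically, much as
        // "def total = bind score + 10" would in JavaFX Script.
        import javafx.beans.binding.NumberBinding;
        import javafx.beans.property.IntegerProperty;
        import javafx.beans.property.SimpleIntegerProperty;

        public class BindingSketch {
            public static void main(String[] args) {
                IntegerProperty score = new SimpleIntegerProperty(0);
                NumberBinding total = score.add(10);  // stays in sync with score
                score.set(32);
                System.out.println(total.getValue()); // prints 42
            }
        }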


  • Given an XML file which contains a representation of a graph, how can I apply the DFS algorithm to it? [on hold]

    - by winston smith
    Given the following XML, which is a directed graph:

        <?xml version="1.0" encoding="iso-8859-1" ?>
        <!DOCTYPE graph PUBLIC "-//FC//DTD red//EN" "../dtd/graph.dtd">
        <graph direct="1">
            <vertex label="V0"/>
            <vertex label="V1"/>
            <vertex label="V2"/>
            <vertex label="V3"/>
            <vertex label="V4"/>
            <vertex label="V5"/>
            <edge source="V0" target="V1" weight="1"/>
            <edge source="V0" target="V4" weight="1"/>
            <edge source="V5" target="V2" weight="1"/>
            <edge source="V5" target="V4" weight="1"/>
            <edge source="V1" target="V2" weight="1"/>
            <edge source="V1" target="V3" weight="1"/>
            <edge source="V1" target="V4" weight="1"/>
            <edge source="V2" target="V3" weight="1"/>
        </graph>

    With these classes I parsed the graph and gave it an adjacency-list representation:

        import java.io.IOException;
        import java.util.HashSet;
        import java.util.LinkedList;
        import java.util.Collection;
        import java.util.Iterator;
        import java.util.logging.Level;
        import java.util.logging.Logger;
        import practica3.util.Disc;

        public class ParsingXML {
            public static void main(String[] args) {
                try {
                    // TODO code application logic here
                    Collection<Vertex> sources = new HashSet<Vertex>();
                    LinkedList<String> lines = Disc.readFile("xml/directed.xml");
                    for (String lin : lines) {
                        int i = Disc.find(lin, "source=\"");
                        String data = "";
                        if (i > 0 && i < lin.length()) {
                            while (lin.charAt(i + 1) != '"') {
                                data += lin.charAt(i + 1);
                                i++;
                            }
                            Vertex v = new Vertex();
                            v.setName(data);
                            v.setAdy(new HashSet<Vertex>());
                            sources.add(v);
                        }
                    }
                    Iterator it = sources.iterator();
                    while (it.hasNext()) {
                        Vertex ver = (Vertex) it.next();
                        Collection<Vertex> adyacencias = ver.getAdy();
                        LinkedList<String> ls = Disc.readFile("xml/graphs.xml");
                        for (String lin : ls) {
                            int i = Disc.find(lin, "target=\"");
                            String data = "";
                            if (lin.contains("source=\"" + ver.getName())) {
                                Vertex v = new Vertex();
                                if (i > 0 && i < lin.length()) {
                                    while (lin.charAt(i + 1) != '"') {
                                        data += lin.charAt(i + 1);
                                        i++;
                                    }
                                    v.setName(data);
                                }
                                i = Disc.find(lin, "weight=\"");
                                data = "";
                                if (i > 0 && i < lin.length()) {
                                    while (lin.charAt(i + 1) != '"') {
                                        data += lin.charAt(i + 1);
                                        i++;
                                    }
                                    v.setWeight(Integer.parseInt(data));
                                }
                                if (v.getName() != null) {
                                    adyacencias.add(v);
                                }
                            }
                        }
                    }
                    for (Vertex vert : sources) {
                        System.out.println(vert);
                        System.out.println("adyacencias: " + vert.getAdy());
                    }
                } catch (IOException ex) {
                    Logger.getLogger(ParsingXML.class.getName()).log(Level.SEVERE, null, ex);
                }
            }
        }

    This is another class:

        import java.util.Collection;
        import java.util.Objects;

        public class Vertex {
            private String name;
            private int weight;
            private Collection ady;

            public Collection getAdy() { return ady; }
            public void setAdy(Collection adyacencias) { this.ady = adyacencias; }
            public String getName() { return name; }
            public void setName(String nombre) { this.name = nombre; }
            public int getWeight() { return weight; }
            public void setWeight(int weight) { this.weight = weight; }

            @Override
            public int hashCode() {
                int hash = 7;
                hash = 43 * hash + Objects.hashCode(this.name);
                hash = 43 * hash + this.weight;
                return hash;
            }

            @Override
            public boolean equals(Object obj) {
                if (obj == null) { return false; }
                if (getClass() != obj.getClass()) { return false; }
                final Vertex other = (Vertex) obj;
                if (!Objects.equals(this.name, other.name)) { return false; }
                if (this.weight != other.weight) { return false; }
                return true;
            }

            @Override
            public String toString() {
                return "Vertice{" + "name=" + name + ", weight=" + weight + '}';
            }
        }

    And finally:

        /**
         * @author user
         */
        /* -*-jde-*- */
        /* <Disc.java> Contains the main argument */
        import java.io.*;
        import java.util.LinkedList;

        /**
         * Reading and writing of files as lists of strings.
         * Ideal for use with the graph classes.
         *
         * @author Peralta Santa Anna Victor Miguel
         * @since July 2011
         */
        public class Disc {

            /**
             * Method for reading a file.
             *
             * @param fileName the file to read
             * @return the file represented as a list of strings
             */
            public static LinkedList<String> readFile(String fileName) throws IOException {
                BufferedReader file = new BufferedReader(new FileReader(fileName));
                LinkedList<String> textlist = new LinkedList<String>();
                while (file.ready()) {
                    textlist.add(file.readLine().trim());
                }
                file.close();
                /*
                for (String linea : textlist) {
                    if (linea.contains("source")) {
                        // String generado = linea.replaceAll("<\\w+\\s+\"", "");
                        // System.out.println(generado);
                    }
                }
                */
                return textlist;
            } // readFile

            public static int find(String linea, String palabra) {
                int i, j;
                boolean found = false;
                for (i = 0, j = 0; i < linea.length(); i++) {
                    if (linea.charAt(i) == palabra.charAt(j)) {
                        j++;
                        if (j == palabra.length()) {
                            found = true;
                            return i;
                        }
                    } else {
                        continue;
                    }
                }
                if (!found) {
                    i = -1;
                }
                return i;
            }

            /**
             * Method for writing a file.
             *
             * @param fileName the file to write
             * @param tofile the list of strings that will end up in the file
             * @param append whether to append to the existing content or start from scratch
             */
            public static void writeFile(String fileName, LinkedList<String> tofile, boolean append) throws IOException {
                FileWriter file = new FileWriter(fileName, append);
                for (int i = 0; i < tofile.size(); i++) {
                    file.write(tofile.get(i) + "\n");
                }
                file.close();
            } // writeFile

            /**
             * Method for writing a string to a file.
             *
             * @param msg the file to write
             * @param tofile the string that will end up in the file
             * @param append whether to append to the existing content or start from scratch
             */
            public static void writeFile(String msg, String tofile, boolean append) throws IOException {
                FileWriter file = new FileWriter(msg, append);
                file.write(tofile);
                file.close();
            } // writeFile
        }

    I'm stuck on the best way, given this adjacency-list representation of the graph, to apply the depth-first search algorithm to it. Any idea of how to approach and complete the task?
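    A hedged sketch of one possible approach: an iterative depth-first search over the Vertex class above. It assumes the adjacency entries reference shared Vertex instances; the parser as written creates detached copies, so a lookup-by-name step may be needed first.

        import java.util.ArrayDeque;
        import java.util.ArrayList;
        import java.util.Deque;
        import java.util.HashSet;
        import java.util.List;
        import java.util.Set;

        public class DepthFirst {
            public static List<Vertex> dfs(Vertex start) {
                List<Vertex> order = new ArrayList<Vertex>();   // visit order
                Set<Vertex> seen = new HashSet<Vertex>();       // avoids revisits and cycles
                Deque<Vertex> stack = new ArrayDeque<Vertex>();
                stack.push(start);
                seen.add(start);
                while (!stack.isEmpty()) {
                    Vertex v = stack.pop();
                    order.add(v);
                    for (Object o : v.getAdy()) {               // getAdy() is a raw Collection
                        Vertex w = (Vertex) o;
                        if (seen.add(w)) {                      // true only on first sighting
                            stack.push(w);
                        }
                    }
                }
                return order;
            }
        }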


  • The SSIS tuning tip that everyone misses

    - by Rob Farley
    I know that everyone misses this, because I'm yet to find someone who doesn't have a bit of an epiphany when I describe it.

    When tuning Data Flows in SQL Server Integration Services, people see the Data Flow as moving from the Source to the Destination, passing through a number of transformations. What people don't consider is the Source: getting the data out of a database. Remember, the source of data for your Data Flow is not your Source Component. It's wherever the data is, within your database, probably on a disk somewhere. You need to tune your query to optimise it for SSIS, and this is what most people fail to do.

    I'm not suggesting that people don't tune their queries – there's plenty of information out there about making sure that your queries run as fast as possible. But for SSIS, it's not about how fast your query runs. Let me say that again, but in bolder text: The speed of an SSIS Source is not about how fast your query runs.

    If your query is used in a Source component for SSIS, the thing that matters is how fast it starts returning data – in particular, those first 10,000 rows to populate that first buffer, ready to pass down the rest of the transformations on its way to the Destination.

    Let's look at a very simple query as an example, using the AdventureWorks database. We're picking the different Weight values out of the Product table, and it's doing this by scanning the table and doing a Sort. It's a Distinct Sort, which means that the duplicates are discarded. It'll be no surprise to see that the data produced is sorted. Obvious, I know, but I'm making a comparison to what I'll do later.

    Before I explain the problem here, let me jump back into the SSIS world... If you've investigated how to tune an SSIS flow, then you'll know that some SSIS Data Flow Transformations are known to be Blocking, some are Partially Blocking, and some are simply Row transformations. Take the SSIS Sort transformation, for example. I'm using a larger data set for this, because my small list of Weights won't demonstrate it well enough. Seven buffers of data came out of the source, but none of them could be pushed past the Sort operator, just in case the last buffer contained data that would be sorted into the first buffer. This is a blocking operation.

    Back in the land of T-SQL, we consider our Distinct Sort operator. It's also blocking. It won't let data through until it's seen all of it. If you weren't okay with blocking operations in SSIS, why would you be happy with them in an execution plan? The source of your data is not your OLE DB Source. Remember this. The source of your data is the NCIX/CIX/Heap from which it's being pulled.

    Picture it like this... the data flows from the Clustered Index, through the Distinct Sort operator, into the SELECT operator, where a series of SSIS buffers are populated, flowing (as they get full) down through the SSIS transformations. Alright, I know that I'm taking some liberties here, because the two queries aren't the same, but consider the visual. The data is flowing from your disk and through your execution plan before it reaches SSIS, so you could easily find that a blocking operation in your plan is just as painful as a blocking operation in your SSIS Data Flow.

    Luckily, T-SQL gives us a brilliant query hint to help avoid this:

    OPTION (FAST 10000)

    This hint means that the optimizer will choose a plan which is optimised for the first 10,000 rows – the default SSIS buffer size. And the effect can be quite significant.

    First let's consider a simple example, then we'll look at a larger one. Consider our weights. We don't have 10,000, so I'm going to use OPTION (FAST 1) instead. You'll notice that the query is more expensive, using a Flow Distinct operator instead of the Distinct Sort. This operator is consuming 84% of the query, instead of the 59% we saw from the Distinct Sort. But the first row could be returned quicker – a Flow Distinct operator is non-blocking.

    The data here isn't sorted, of course. It's in the same order that it came out of the index, just with duplicates removed. As soon as a Flow Distinct sees a value that it hasn't come across before, it pushes it out to the operator on its left. It still has to maintain the list of what it's seen so far, but by handling it one row at a time, it can push rows through quicker. Overall, it's a lot more work than the Distinct Sort, but if the priority is the first few rows, then perhaps that's exactly what we want.

    The Query Optimizer seems to do this by optimising the query as if there were only one row coming through. This one-row estimation is caused by the Query Optimizer imagining the SELECT operation saying "Give me one row" first, and this message being passed all the way along. The request might not make it all the way back to the source, but in my simple example, it does. I hope this simple example has helped you understand the significance of the blocking operator.

    Now I'm going to show you an example on a much larger data set. This query was fetching about 780,000 rows, and these are the Estimated Plans. The data needed to be sorted, to support further SSIS operations that required it. First, without the hint... and now with OPTION (FAST 10000): a very different plan, I'm sure you'll agree. In case you're curious, those arrows in the top one are 780,000 rows in size. In the second, they're estimated to be 10,000, although the Actual figures end up being 780,000.

    The top one definitely runs faster. It finished several times faster than the second one. With the amount of data being considered, these numbers were in minutes. Look at the second one – it's doing Nested Loops across 780,000 rows! That's not generally recommended at all. That's "go and make yourself a coffee" time. In this case, it was about six or seven minutes. The faster one finished in about a minute.

    But in SSIS-land, things are different. The particular data flow that was consuming this data was significant. It was being pumped into a Script Component to process each row based on previous rows, creating about a dozen different flows. The data flow would take roughly ten minutes to run – ten minutes from when the data first appeared.

    The query that completes faster – chosen by the Query Optimizer with no hints, based on accurate statistics (rather than pretending the numbers are smaller) – would take a minute to start getting the data into SSIS, at which point the ten-minute flow would start, taking eleven minutes to complete.

    The query that took longer – chosen by the Query Optimizer pretending it only wanted the first 10,000 rows – would take only ten seconds to fill the first buffer. Despite the fact that it might have taken the database another six or seven minutes to get the data out, SSIS didn't care. Every time it wanted the next buffer of data, it was already available, and the whole process finished in about ten minutes and ten seconds.

    When debugging SSIS, you run the package and sit there waiting to see the Debug information start appearing. You look for the numbers on the data flow, and for operators going Yellow and Green. Without the hint, I'd sit there for a minute. With the hint, just ten seconds. You can imagine which one I preferred. By adding this hint, it felt like a magic wand had been waved across the query, to make it run several times faster. It wasn't the case at all – but it felt like it to SSIS.
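    As a hedged illustration of the hint (a sketch against AdventureWorks, which the post uses; the exact queries behind the post's plans are not reproduced here):

        -- Distinct weights, planned for the first buffer rather than total runtime:
        -- OPTION (FAST n) asks the optimizer to favour returning the first n rows
        -- quickly; 10000 matches the default SSIS buffer size discussed above.
        SELECT DISTINCT Weight
        FROM Production.Product
        OPTION (FAST 10000);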


  • BI Applications overview

    - by sv744
    Welcome to the Oracle BI applications blog! This blog will talk about various features, the general roadmap, descriptions of functionality, and implementation steps related to Oracle BI applications. In this first post we start with an overview of the BI apps and will delve deeper into some of the topics below in the upcoming weeks and months. If there are other topics you would like us to talk about, please feel free to provide feedback.

    The Oracle BI applications are a set of pre-built applications that enable pervasive BI by providing role-based insight for each functional area, including sales, service, marketing, contact center, finance, supplier/supply chain, HR/workforce, and executive management. For example, Sales Analytics includes role-based applications for sales executives, sales management, as well as front-line sales reps, each of whom have different needs. The applications integrate and transform data from a range of enterprise sources – including Siebel, Oracle, PeopleSoft, SAP, and others – into actionable intelligence for each business function and user role.

    This post starts with the key benefits and characteristics of Oracle BI applications. In a series of subsequent posts, each of these points will be explained in detail.

    Why BI apps?
    - Demonstrate the value of BI to a business user: show reports, dashboards, and models that can answer their business questions as part of the sales cycle.
    - Demonstrate technical feasibility of a BI project, significantly lower risk, and improve success.
    - Build-vs-buy benefit: don't have to start with a blank sheet of paper.
    - Help consolidate disparate systems; data integration in M&A situations.
    - Insulate BI consumers from changes in the OLTP.
    - Present OLTP data and highlight issues of poor or missing data – and improve data quality and accuracy.

    Prebuilt Integrations
    - BI apps support prebuilt integrations against leading ERP sources: Fusion Applications, E-Business Suite, PeopleSoft, JD Edwards, Siebel, SAP.
    - Co-developed with input from functional experts in the BI and Applications teams.
    - Out-of-the-box dimensional-model-to-source-model mappings.
    - Multi-source and multi-instance support.

    Rich Data Model
    BI apps have a very rich dimensional data model, built over 10 years, that incorporates best practises from a BI modeling perspective as well as reflecting source-system complexities.
    - Conformed dimensional model across all business subject areas allows cross-functional reporting, e.g. customer / supplier 360.
    - Over 360 fact tables across 7 product areas: CRM – 145, SCM – 47, Financials – 28, Procurement – 20, HCM – 27, Projects – 18, Campus Solutions – 21, PLM – 56.
    - Supported by 300 physical dimensions.
    - Support for extensive calendars: Gregorian, enterprise, and ledger based.
    - Conformed data model and metrics for real-time vs. warehouse-based reporting.
    - Multi-tenant enabled.

    Extensive BI-related transformations
    BI apps ETL and data integration support the various transformations required for dimensional models and reporting requirements. All of these have been distilled into common patterns and abstracted logic which can be readily reused across different modules:
    - Slowly Changing Dimension support
    - Hierarchy flattening support, and row/column hybrid hierarchy flattening
    - As-Is vs. As-Was hierarchy support
    - Currency conversion: support for 3 corporate currencies plus CRM, ledger, and transaction currencies
    - UOM conversion
    - Internationalization/localization and dynamic data translations
    - Code standardization (Domains)
    - Historical snapshots
    - Cycle and process lifecycle computations
    - Balance facts
    - Equalization of GL accounting chartfields/segments
    - Standardized values for categorizing GL accounts
    - Reconciliation between GL and subledgers to track accounted/transferred/posted transactions to GL
    - Materialization of data only available through costly and complex APIs, e.g. Fusion Payroll, EBS / Fusion accruals
    - Complex event interpretation of source data, e.g.:
      o    what constitutes a transfer
      o    deriving supervisors via the position hierarchy
      o    deriving primary assignment in PSFT
      o    categorizing and transposing payroll balances to specific metrics, to support side-by-side comparison of measures such as fixed salary, variable salary, tax, bonus, and overtime payments
      o    counting of events, e.g. converting events to fact counters so that, for example, the number of hires can easily be added up and compared alongside total transfers and terminations
    - Multi-pass processing of multiple sources, e.g. headcount, salary, promotion, and performance, to allow side-by-side comparison
    - Adding value to data to aid analysis, through banding, additional domain classifications, and groupings, to allow higher-level analytical reporting and data discovery
    - Calculation of complex measures, for example:
      o    COGS, DSO, DPO, inventory turns, etc.
      o    transfers within a hierarchy, or out of / into a hierarchy, relative to a viewpoint in the hierarchy

    Configurability and Extensibility support
    BI apps offer support for extensibility for various entities, either as automated extensibility or as part of the extension methodology:
    - Key Flexfield and Descriptive Flexfield support
    - Extensible attribute support (JDE)
    - Conformed Domains

    ETL Architecture
    BI apps offer a modular adapter architecture which allows support of multiple product lines in a single conformed model:
    - Multi-source, multi-technology support
    - Orchestration: creates a load plan that takes task dependencies and the customer's deployment into account, to generate a plan for the customer's set of multiple complex ETL tasks
    - Plan optimization allowing parallel ETL tasks
    - Oracle: bitmap indexes and partition management
    - High availability support, with follow-the-sun support

    TCO
    BI apps supply several utilities and capabilities that help with the overall total cost of ownership and ensure a rapid implementation:
    - Improved cost of ownership – lower cost to deploy
    - Ongoing support for new versions of the source application
    - Task-based setup flows
    - Data lineage
    - Functional setup performed in a Web UI by a functional person
    - Configuration: Test-to-Production support

    Security
    BI apps support both data and object security, enabling implementations to quickly configure the application as per reporting security needs:
    - Fine-grained object security at report/dashboard and presentation catalog level
    - Data security integration with source systems
    - Extensible to support external data security rules

    Extensive set of KPIs
    - Over 7000 base and derived metrics across all modules
    - Time-series calculations (YoY, % growth, etc.)
    - Common currency and UOM reporting
    - Cross-subject-area KPIs (analyzing HR vs. GL data, drill from GL to AP/AR, etc.)

    Prebuilt reports and dashboards
    - 3000+ prebuilt reports supporting a large number of industries
    - Hundreds of role-based dashboards
    - Dynamic currency conversion at dashboard level

    Highly tuned performance
    The BI apps have been tuned over the years for both a very performant ETL and dashboard performance, using best practises and advanced database features:
    - Optimized data model for BI and analytic queries
    - Prebuilt aggregates, plus the ability for customers to easily create their own aggregates on warehouse facts, allow for scalable end-user performance
    - Incremental extracts and loads, and incremental aggregate builds
    - Automatic table index and statistics management
    - Parallel ETL loads
    - Source-system delete handling
    - Low-latency extract with GoldenGate, and micro-ETL support
    - Bitmap indexes and partitioning support
    - Modularized deployment: start small and add other subject areas seamlessly

    Source-specific staging and real-time schema
    - Support for a source-specific operational reporting schema for EBS, PSFT, Siebel, and JDE

    Application Integrations
    The BI apps also allow for integration with source systems as well as other applications, providing added value through BI and enabling BI consumption during operational decision making:
    - Embedded dashboards for Fusion, EBS, and Siebel applications
    - Action Link support
    - Marketing Segmentation, Sales Predictor Dashboard, and Territory Management

    External Integrations
    The BI apps data integration choices include support for loading external data:
    - External data enrichment choices: UNSPSC, item class, etc.
    - Extensible spend classification

    Broad deployment choices
    - Exalytics support
    - Databases: Oracle, Exadata, Teradata, DB2, MSSQL
    - ETL tool of choice: ODI (coming), Informatica

    Extensible and customizable
    - Extensible architecture and methodology to add custom and external content
    - Upgradable across releases

    Thanks for reading a long post, and be on the lookout for future posts. We look forward to your valuable feedback on these topics, as well as suggestions on what other topics you would like us to cover.


  • How to disable proxy requests once a server has been added to spammers "open proxy" list?

    - by Matt
    Hello all, I've just started at a new company and have been going over the setup of their Apache web server conf files... only to find that their Apache servers have been set up as open proxies, available to all the world, for the last two months. I've already set ProxyRequests Off in the httpd.conf file and restarted the web server, but the access log file is still growing at a horrendous rate (about a gig a day). I noticed that another question was posted on here about this (http://serverfault.com/questions/63715/apache-hit-with-proxy-request), but their access log was supposedly returning 404 errors, while mine appears to be returning 403 and 404 codes... Is this correct?

    Here are a few lines out of my access log:

    87.118.118.124 - - [16/Mar/2010:10:56:36 -0400] "GET http://www.c5interlude.ru/torrent/viewtopic.php?p=2501 HTTP/1.0" 404 219 "http://www.c5interlude.ru/torrent/viewtopic.php?p=2501" "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; .NET CLR 1.1.4322)"
    117.41.184.27 - - [16/Mar/2010:10:56:36 -0400] "GET http://ad.xtendmedia.com/st?ad_type=iframe&ad_size=300x250&section=790074 HTTP/1.0" 404 200 "http://www.newbiegamer.com" "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.0; Alexa Toolbar)"
    122.224.55.222 - - [16/Mar/2010:10:56:36 -0400] "GET http://www.188woool.net/\xb4\xf3\xd4\xcb\xb4\xab\xca\xc0.rar HTTP/1.1" 403 214 "http://www.188woool.net/\xb4\xf3\xd4\xcb\xb4\xab\xca\xc0.rar" "Mozilla/4.0"
    58.55.21.40 - - [16/Mar/2010:10:56:36 -0400] "GET http://www.cpx24.com/ad1.js HTTP/1.0" 404 204 "http://thebighits.com/?id=aibux" "Mozilla/4.0 (compatible; MSIE 5.01; Windows NT 5.0)"
    122.226.223.188 - - [16/Mar/2010:10:56:36 -0400] "GET http://ad.reduxmedia.com/st?ad_type=iframe&ad_size=160x600&section=798636 HTTP/1.0" 404 200 "http://www.gvvu.com" "Mozilla/4.0 (compatible; MSIE 5.5; AOL 6.0; Windows 98; Win 9x 4.90)"
    84.51.109.31 - - [16/Mar/2010:10:56:36 -0400] "GET http://www.kslp.ru/forum/index.php HTTP/1.0" 404 213 "http://www.kslp.ru/forum/index.php" "Mozilla/4.0 (compatible; MSIE 7.0b; Windows NT 6.0 ; .NET CLR 2.0.50215; SL Commerce Client v1.0; Tablet PC 2.0"
    122.224.48.49 - - [16/Mar/2010:10:56:36 -0400] "GET http://www1.vip218.com/\xb2\xca\xba\xe7\xb4\xab\xca\xc0.exe HTTP/1.1" 403 214 "http://www1.vip218.com/\xb2\xca\xba\xe7\xb4\xab\xca\xc0.exe" "Mozilla/4.0"
    117.41.184.27 - - [16/Mar/2010:10:56:36 -0400] "GET http://ad.xtendmedia.com/st?ad_type=iframe&ad_size=728x90&section=657624 HTTP/1.0" 404 200 "http://www.raiseanimals.com" "Mozilla/4.0 (compatible; MSIE 6.0; Windows 98; Alexa Toolbar)"

    And my corresponding error log entries:

    [Tue Mar 16 10:56:36 2010] [error] [client 87.118.118.124] File does not exist: C:/public_html/torrent, referer: http://www.c5interlude.ru/torrent/viewtopic.php?p=2501
    [Tue Mar 16 10:56:36 2010] [error] [client 117.41.184.27] File does not exist: C:/public_html/st, referer: http://www.newbiegamer.com
    [Tue Mar 16 10:56:36 2010] [error] [client 122.224.55.222] (22)Invalid argument: Cannot map GET http://www.188woool.net/\xb4\xf3\xd4\xcb\xb4\xab\xca\xc0.rar HTTP/1.1 to file, referer: http://www.188woool.net/\xb4\xf3\xd4\xcb\xb4\xab\xca\xc0.rar
    [Tue Mar 16 10:56:36 2010] [error] [client 58.55.21.40] File does not exist: C:/public_html/ad1.js, referer: http://thebighits.com/?id=aibux
    [Tue Mar 16 10:56:36 2010] [error] [client 122.226.223.188] File does not exist: C:/public_html/st, referer: http://www.gvvu.com
    [Tue Mar 16 10:56:36 2010] [error] [client 84.51.109.31] File does not exist: C:/public_html/forum, referer: http://www.kslp.ru/forum/index.php
    [Tue Mar 16 10:56:36 2010] [error] [client 122.224.48.49] (22)Invalid argument: Cannot map GET http://www1.vip218.com/\xb2\xca\xba\xe7\xb4\xab\xca\xc0.exe HTTP/1.1 to file, referer: http://www1.vip218.com/\xb2\xca\xba\xe7\xb4\xab\xca\xc0.exe
    [Tue Mar 16 10:56:36 2010] [error] [client 117.41.184.27] File does not exist: C:/public_html/st, referer: http://www.raiseanimals.com

    Does this in fact look like the server is blocking them correctly, and is there anything else that I could do better to cut down on my access log size? (Perhaps block these requests from the server completely?) Thanks! Matt
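    One hedged option, assuming mod_rewrite is available (a sketch, not a tested fix for this server): with ProxyRequests Off already set, reject proxy-style requests, whose request line carries an absolute http:// URI, with a uniform 403 before any file mapping happens. The hits are still logged until the bots drop the host from their open-proxy lists, but nothing further is served.

        # Deny any request whose request line names an absolute http:// URL.
        RewriteEngine On
        RewriteCond %{THE_REQUEST} ^[A-Z]+\s+http:// [NC]
        RewriteRule .* - [F]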


  • While running mvn jetty:run, the following error is shown

    - by munna
    C:\source\myproject>mvn jetty:run
    [INFO] Scanning for projects...
    [INFO] ------------------------------------------------------------------------
    [INFO] Building AppFuse Spring MVC Application
    [INFO]    task-segment: [jetty:run]
    [INFO] ------------------------------------------------------------------------
    [INFO] Preparing jetty:run
    [WARNING] POM for 'xfire:xfire-jsr181-api:pom:1.0-M1:compile' is invalid. Its dependencies (if any) will NOT be available to the current build.
    [INFO] [warpath:add-classes {execution: default}]
    [INFO] [aspectj:compile {execution: default}]
    [INFO] [native2ascii:native2ascii {execution: native2ascii-utf8}]
    [INFO] [native2ascii:native2ascii {execution: native2ascii-8859_1}]
    [INFO] [resources:resources {execution: default-resources}]
    [WARNING] File encoding has not been set, using platform encoding Cp1252, i.e. build is platform dependent!
    [WARNING] Using platform encoding (Cp1252 actually) to copy filtered resources, i.e. build is platform dependent!
    [INFO] Copying 12 resources
    [INFO] Copying 1 resource
    [INFO] Copying 26 resources
    [INFO] Copying 26 resources
    [INFO] [compiler:compile {execution: default-compile}]
    [INFO] Nothing to compile - all classes are up to date
    [INFO] [resources:testResources {execution: default-testResources}]
    [WARNING] File encoding has not been set, using platform encoding Cp1252, i.e. build is platform dependent!
    [WARNING] Using platform encoding (Cp1252 actually) to copy filtered resources, i.e. build is platform dependent!
    [INFO] Copying 4 resources
    [INFO] Copying 9 resources
    [INFO] Preparing hibernate3:hbm2ddl
    [WARNING] Removing: hbm2ddl from forked lifecycle, to prevent recursive invocation.
    [INFO] [warpath:add-classes {execution: default}]
    [INFO] [aspectj:compile {execution: default}]
    [INFO] [native2ascii:native2ascii {execution: native2ascii-utf8}]
    [INFO] [native2ascii:native2ascii {execution: native2ascii-8859_1}]
    [INFO] [resources:resources {execution: default-resources}]
    [WARNING] File encoding has not been set, using platform encoding Cp1252, i.e. build is platform dependent!
    [WARNING] Using platform encoding (Cp1252 actually) to copy filtered resources, i.e. build is platform dependent!
    [INFO] Copying 12 resources
    [INFO] Copying 1 resource
    [INFO] Copying 26 resources
    [INFO] Copying 26 resources
    [INFO] Copying 26 resources
    [INFO] Copying 26 resources
    [INFO] [hibernate3:hbm2ddl {execution: default}]
    [INFO] Configuration XML file loaded: file:/C:/source/myproject/src/main/resources/hibernate.cfg.xml
    [INFO] Configuration XML file loaded: file:/C:/source/myproject/src/main/resources/hibernate.cfg.xml
    [INFO] Configuration Properties file loaded: C:\source\myproject\target\classes\jdbc.properties
    alter table user_role drop foreign key FK143BF46A4FD90D75;
    alter table user_role drop foreign key FK143BF46AF503D155;
    drop table if exists app_user;
    drop table if exists role;
    drop table if exists user_role;
    create table app_user (id bigint not null auto_increment, account_expired bit not null, account_locked bit not null, address varchar(150), city varchar(50) not null, country varchar(100), postal_code varchar(15) not null, province varchar(100), credentials_expired bit not null, email varchar(255) not null unique, account_enabled bit, first_name varchar(50) not null, last_name varchar(50) not null, password varchar(255) not null, password_hint varchar(255), phone_number varchar(255), username varchar(50) not null unique, version integer, website varchar(255), primary key (id)) ENGINE=InnoDB;
    create table role (id bigint not null auto_increment, description varchar(64), name varchar(20), primary key (id)) ENGINE=InnoDB;
    create table user_role (user_id bigint not null, role_id bigint not null, primary key (user_id, role_id)) ENGINE=InnoDB;
    alter table user_role add index FK143BF46A4FD90D75 (role_id), add constraint FK143BF46A4FD90D75 foreign key (role_id) references role (id);
    alter table user_role add index FK143BF46AF503D155 (user_id), add constraint FK143BF46AF503D155 foreign key (user_id) references app_user (id);
    [INFO] [compiler:testCompile {execution: default-testCompile}]
    [INFO] Nothing to compile - all classes are up to date
    [INFO] [dbunit:operation {execution: test-compile}]
    [INFO] [jetty:run {execution: default-cli}]
    [INFO] Configuring Jetty for project: AppFuse Spring MVC Application
    [INFO] Webapp source directory = C:\source\myproject\src\main\webapp
    [INFO] web.xml file = C:\source\myproject\src\main\webapp\WEB-INF\web.xml
    [INFO] Classes = C:\source\myproject\target\classes
    [INFO] Adding extra scan target from pattern: C:\source\myproject\src\main\webapp\WEB-INF\applicationContext-validation.xml
    [INFO] Adding extra scan target from pattern: C:\source\myproject\src\main\webapp\WEB-INF\applicationContext.xml
    [INFO] Adding extra scan target from pattern: C:\source\myproject\src\main\webapp\WEB-INF\dispatcher-servlet.xml
    [INFO] Adding extra scan target from pattern: C:\source\myproject\src\main\webapp\WEB-INF\menu-config.xml
    [INFO] Adding extra scan target from pattern: C:\source\myproject\src\main\webapp\WEB-INF\urlrewrite.xml
    [INFO] Adding extra scan target from pattern: C:\source\myproject\src\main\webapp\WEB-INF\validation.xml
    [INFO] Adding extra scan target from pattern: C:\source\myproject\src\main\webapp\WEB-INF\validator-rules-custom.xml
    [INFO] Adding extra scan target from pattern: C:\source\myproject\src\main\webapp\WEB-INF\validator-rules.xml
    [INFO] Adding extra scan target from pattern: C:\source\myproject\src\main\webapp\WEB-INF\web.xml
    2010-06-02 15:13:28.921::INFO:  Logging to STDERR via org.mortbay.log.StdErrLog
    [INFO] Context path = /
    [INFO] Tmp directory = determined at runtime
    [INFO] Web defaults = org/mortbay/jetty/webapp/webdefault.xml
    [INFO] Web overrides = none
    [INFO] Webapp directory = C:\source\myproject\src\main\webapp
    [INFO] Starting jetty 6.1.9 ...
    2010-06-02 15:13:28.983::INFO:  jetty-6.1.9
    2010-06-02 15:13:28.248::INFO:  No Transaction manager found - if your webapp requires one, please configure one.
    2010-06-02 15:13:28.482:/:INFO:  Initializing Spring root WebApplicationContext
    [myproject] ERROR [main] ContextLoader.initWebApplicationContext(215) | Context initialization failed
    org.springframework.beans.factory.BeanDefinitionStoreException: IOException parsing XML document from ServletContext resource [/WEB-INF/xfire-servlet.xml]; nested exception is java.io.FileNotFoundException: Could not open ServletContext resource [/WEB-INF/xfire-servlet.xml]
        at org.springframework.beans.factory.xml.XmlBeanDefinitionReader.loadBeanDefinitions(XmlBeanDefinitionReader.java:349)
        at org.springframework.beans.factory.xml.XmlBeanDefinitionReader.loadBeanDefinitions(XmlBeanDefinitionReader.java:310)
        at org.springframework.beans.factory.support.AbstractBeanDefinitionReader.loadBeanDefinitions(AbstractBeanDefinitionReader.java:143)
        at org.springframework.beans.factory.support.AbstractBeanDefinitionReader.loadBeanDefinitions(AbstractBeanDefinitionReader.java:178)
        at org.springframework.beans.factory.support.AbstractBeanDefinitionReader.loadBeanDefinitions(AbstractBeanDefinitionReader.java:149)
        at org.springframework.web.context.support.XmlWebApplicationContext.loadBeanDefinitions(XmlWebApplicationContext.java:124)
        at org.springframework.web.context.support.XmlWebApplicationContext.loadBeanDefinitions(XmlWebApplicationContext.java:92)
        at org.springframework.context.support.AbstractRefreshableApplicationContext.refreshBeanFactory(AbstractRefreshableApplicationContext.java:123)
        at org.springframework.context.support.AbstractApplicationContext.obtainFreshBeanFactory(AbstractApplicationContext.java:423)
        at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:353)
        at org.springframework.web.context.ContextLoader.createWebApplicationContext(ContextLoader.java:255)
        at org.springframework.web.context.ContextLoader.initWebApplicationContext(ContextLoader.java:199)
        at org.springframework.web.context.ContextLoaderListener.contextInitialized(ContextLoaderListener.java:45)
        at org.mortbay.jetty.handler.ContextHandler.startContext(ContextHandler.java:540)
        at org.mortbay.jetty.servlet.Context.startContext(Context.java:135)
        at org.mortbay.jetty.webapp.WebAppContext.startContext(WebAppContext.java:1220)
        at org.mortbay.jetty.handler.ContextHandler.doStart(ContextHandler.java:510)
        at org.mortbay.jetty.webapp.WebAppContext.doStart(WebAppContext.java:448)
        at org.mortbay.jetty.plugin.Jetty6PluginWebAppContext.doStart(Jetty6PluginWebAppContext.java:110)
        at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:39)
        at org.mortbay.jetty.handler.HandlerCollection.doStart(HandlerCollection.java:152)
        at org.mortbay.jetty.handler.ContextHandlerCollection.doStart(ContextHandlerCollection.java:156)
        at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:39)
        at org.mortbay.jetty.handler.HandlerCollection.doStart(HandlerCollection.java:152)
        at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:39)
        at org.mortbay.jetty.handler.HandlerWrapper.doStart(HandlerWrapper.java:130)
        at org.mortbay.jetty.Server.doStart(Server.java:222)
        at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:39)
        at org.mortbay.jetty.plugin.Jetty6PluginServer.start(Jetty6PluginServer.java:132)
        at org.mortbay.jetty.plugin.AbstractJettyMojo.startJetty(AbstractJettyMojo.java:357)
        at org.mortbay.jetty.plugin.AbstractJettyMojo.execute(AbstractJettyMojo.java:293)
        at org.mortbay.jetty.plugin.AbstractJettyRunMojo.execute(AbstractJettyRunMojo.java:203)
        at org.mortbay.jetty.plugin.Jetty6RunMojo.execute(Jetty6RunMojo.java:184)
        at org.apache.maven.plugin.DefaultPluginManager.executeMojo(DefaultPluginManager.java:490)
        at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeGoals(DefaultLifecycleExecutor.java:694)
        at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeStandaloneGoal(DefaultLifecycleExecutor.java:569)
        at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeGoal(DefaultLifecycleExecutor.java:539)
        at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeGoalAndHandleFailures(DefaultLifecycleExecutor.java:387)
        at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeTaskSegments(DefaultLifecycleExecutor.java:348)
        at org.apache.maven.lifecycle.DefaultLifecycleExecutor.execute(DefaultLifecycleExecutor.java:180)
        at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:328)
        at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:138)
        at org.apache.maven.cli.MavenCli.main(MavenCli.java:362)
        at org.apache.maven.cli.compat.CompatibleMain.main(CompatibleMain.java:60)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.codehaus.classworlds.Launcher.launchEnhanced(Launcher.java:315)
        at org.codehaus.classworlds.Launcher.launch(Launcher.java:255)
        at org.codehaus.classworlds.Launcher.mainWithExitCode(Launcher.java:430)
        at org.codehaus.classworlds.Launcher.main(Launcher.java:375)
    Caused by: java.io.FileNotFoundException: Could not open ServletContext resource [/WEB-INF/xfire-servlet.xml]
        at org.springframework.web.context.support.ServletContextResource.getInputStream(ServletContextResource.java:116)
        at org.springframework.beans.factory.xml.XmlBeanDefinitionReader.loadBeanDefinitions(XmlBeanDefinitionReader.java:336)
        ... 51 more
    2010-06-02 15:13:29.919::WARN:  Failed startup of context org.mortbay.jetty.plugin.Jetty6PluginWebAppContext@1ba4806{/,C:\source\myproject\src\main\webapp}
    org.springframework.beans.factory.BeanDefinitionStoreException: IOException parsing XML document from ServletContext resource [/WEB-INF/xfire-servlet.xml]; nested exception is java.io.FileNotFoundException: Could not open ServletContext resource [/WEB-INF/xfire-servlet.xml]
        at org.springframework.beans.factory.xml.XmlBeanDefinitionReader.loadBeanDefinitions(XmlBeanDefinitionReader.java:349)
        at org.springframework.beans.factory.xml.XmlBeanDefinitionReader.loadBeanDefinitions(XmlBeanDefinitionReader.java:310)
        at org.springframework.beans.factory.support.AbstractBeanDefinitionReader.loadBeanDefinitions(AbstractBeanDefinitionReader.java:143)
        at org.springframework.beans.factory.support.AbstractBeanDefinitionReader.loadBeanDefinitions(AbstractBeanDefinitionReader.java:178)
        at org.springframework.beans.factory.support.AbstractBeanDefinitionReader.loadBeanDefinitions(AbstractBeanDefinitionReader.java:149)
        at org.springframework.web.context.support.XmlWebApplicationContext.loadBeanDefinitions(XmlWebApplicationContext.java:124)
        at org.springframework.web.context.support.XmlWebApplicationContext.loadBeanDefinitions(XmlWebApplicationContext.java:92)
        at org.springframework.context.support.AbstractRefreshableApplicationContext.refreshBeanFactory(AbstractRefreshableApplicationContext.java:123)
        at org.springframework.context.support.AbstractApplicationContext.obtainFreshBeanFactory(AbstractApplicationContext.java:423)
        at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:353)
        at org.springframework.web.context.ContextLoader.createWebApplicationContext(ContextLoader.java:255)
        at org.springframework.web.context.ContextLoader.initWebApplicationContext(ContextLoader.java:199)
        at org.springframework.web.context.ContextLoaderListener.contextInitialized(ContextLoaderListener.java:45)
        at org.mortbay.jetty.handler.ContextHandler.startContext(ContextHandler.java:540)
        at org.mortbay.jetty.servlet.Context.startContext(Context.java:135)
        at org.mortbay.jetty.webapp.WebAppContext.startContext(WebAppContext.java:1220)
        at org.mortbay.jetty.handler.ContextHandler.doStart(ContextHandler.java:510)
        at org.mortbay.jetty.webapp.WebAppContext.doStart(WebAppContext.java:448)
        at org.mortbay.jetty.plugin.Jetty6PluginWebAppContext.doStart(Jetty6PluginWebAppContext.java:110)
        at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:39)
        at org.mortbay.jetty.handler.HandlerCollection.doStart(HandlerCollection.java:152)
        at org.mortbay.jetty.handler.ContextHandlerCollection.doStart(ContextHandlerCollection.java:156)
        at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:39)
        at org.mortbay.jetty.handler.HandlerCollection.doStart(HandlerCollection.java:152)
        at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:39)
        at org.mortbay.jetty.handler.HandlerWrapper.doStart(HandlerWrapper.java:130)
        at org.mortbay.jetty.Server.doStart(Server.java:222)
        at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:39)
        at org.mortbay.jetty.plugin.Jetty6PluginServer.start(Jetty6PluginServer.java:132)
        at org.mortbay.jetty.plugin.AbstractJettyMojo.startJetty(AbstractJettyMojo.java:357)
        at org.mortbay.jetty.plugin.AbstractJettyMojo.execute(AbstractJettyMojo.java:293)
        at org.mortbay.jetty.plugin.AbstractJettyRunMojo.execute(AbstractJettyRunMojo.java:203)
        at org.mortbay.jetty.plugin.Jetty6RunMojo.execute(Jetty6RunMojo.java:184)
        at org.apache.maven.plugin.DefaultPluginManager.executeMojo(DefaultPluginManager.java:490)
        at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeGoals(DefaultLifecycleExecutor.java:694)
        at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeStandaloneGoal(DefaultLifecycleExecutor.java:569)
        at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeGoal(DefaultLifecycleExecutor.java:539)
        at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeGoalAndHandleFailures(DefaultLifecycleExecutor.java:387)
        at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeTaskSegments(DefaultLifecycleExecutor.java:348)
        at org.apache.maven.lifecycle.DefaultLifecycleExecutor.execute(DefaultLifecycleExecutor.java:180)
        at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:328)
        at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:138)
        at org.apache.maven.cli.MavenCli.main(MavenCli.java:362)
        at org.apache.maven.cli.compat.CompatibleMain.main(CompatibleMain.java:60)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.codehaus.classworlds.Launcher.launchEnhanced(Launcher.java:315)
        at org.codehaus.classworlds.Launcher.launch(Launcher.java:255)
        at org.codehaus.classworlds.Launcher.mainWithExitCode(Launcher.java:430)
        at org.codehaus.classworlds.Launcher.main(Launcher.java:375)
    Caused by: java.io.FileNotFoundException: Could not open ServletContext resource [/WEB-INF/xfire-servlet.xml]
        at org.springframework.web.context.support.ServletContextResource.getInputStream(ServletContextResource.java:116)
        at org.springframework.beans.factory.xml.XmlBeanDefinitionReader.loadBeanDefinitions(XmlBeanDefinitionReader.java:336)
        ...
51 more 2010-06-02 15:13:29.152::WARN: Nested in org.springframework.beans.factory.Bean DefinitionStoreException: IOException parsing XML document from ServletContext r esource [/WEB-INF/xfire-servlet.xml]; nested exception is java.io.FileNotFoundEx ception: Could not open ServletContext resource [/WEB-INF/xfire-servlet.xml]: java.io.FileNotFoundException: Could not open ServletContext resource [/WEB-INF/ xfire-servlet.xml] at org.springframework.web.context.support.ServletContextResource.getInp utStream(ServletContextResource.java:116) at org.springframework.beans.factory.xml.XmlBeanDefinitionReader.loadBea nDefinitions(XmlBeanDefinitionReader.java:336) at org.springframework.beans.factory.xml.XmlBeanDefinitionReader.loadBea nDefinitions(XmlBeanDefinitionReader.java:310) at org.springframework.beans.factory.support.AbstractBeanDefinitionReade r.loadBeanDefinitions(AbstractBeanDefinitionReader.java:143) at org.springframework.beans.factory.support.AbstractBeanDefinitionReade r.loadBeanDefinitions(AbstractBeanDefinitionReader.java:178) at org.springframework.beans.factory.support.AbstractBeanDefinitionReade r.loadBeanDefinitions(AbstractBeanDefinitionReader.java:149) at org.springframework.web.context.support.XmlWebApplicationContext.load BeanDefinitions(XmlWebApplicationContext.java:124) at org.springframework.web.context.support.XmlWebApplicationContext.load BeanDefinitions(XmlWebApplicationContext.java:92) at org.springframework.context.support.AbstractRefreshableApplicationCon text.refreshBeanFactory(AbstractRefreshableApplicationContext.java:123) at org.springframework.context.support.AbstractApplicationContext.obtain FreshBeanFactory(AbstractApplicationContext.java:423) at org.springframework.context.support.AbstractApplicationContext.refres h(AbstractApplicationContext.java:353) at org.springframework.web.context.ContextLoader.createWebApplicationCon text(ContextLoader.java:255) at org.springframework.web.context.ContextLoader.initWebApplicationConte xt(ContextLoader.java:199) at org.springframework.web.context.ContextLoaderListener.contextInitiali zed(ContextLoaderListener.java:45) at org.mortbay.jetty.handler.ContextHandler.startContext(ContextHandler. java:540) at org.mortbay.jetty.servlet.Context.startContext(Context.java:135) at org.mortbay.jetty.webapp.WebAppContext.startContext(WebAppContext.jav a:1220) at org.mortbay.jetty.handler.ContextHandler.doStart(ContextHandler.java: 510) at org.mortbay.jetty.webapp.WebAppContext.doStart(WebAppContext.java:448 ) at org.mortbay.jetty.plugin.Jetty6PluginWebAppContext.doStart(Jetty6Plug inWebAppContext.java:110) at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java: 39) at org.mortbay.jetty.handler.HandlerCollection.doStart(HandlerCollection .java:152) at org.mortbay.jetty.handler.ContextHandlerCollection.doStart(ContextHan dlerCollection.java:156) at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java: 39) at org.mortbay.jetty.handler.HandlerCollection.doStart(HandlerCollection .java:152) at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java: 39) at org.mortbay.jetty.handler.HandlerWrapper.doStart(HandlerWrapper.java: 130) at org.mortbay.jetty.Server.doStart(Server.java:222) at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java: 39) at org.mortbay.jetty.plugin.Jetty6PluginServer.start(Jetty6PluginServer. 
java:132) at org.mortbay.jetty.plugin.AbstractJettyMojo.startJetty(AbstractJettyMo jo.java:357) at org.mortbay.jetty.plugin.AbstractJettyMojo.execute(AbstractJettyMojo. java:293) at org.mortbay.jetty.plugin.AbstractJettyRunMojo.execute(AbstractJettyRu nMojo.java:203) at org.mortbay.jetty.plugin.Jetty6RunMojo.execute(Jetty6RunMojo.java:184 ) at org.apache.maven.plugin.DefaultPluginManager.executeMojo(DefaultPlugi nManager.java:490) at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeGoals(Defa ultLifecycleExecutor.java:694) at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeStandalone Goal(DefaultLifecycleExecutor.java:569) at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeGoal(Defau ltLifecycleExecutor.java:539) at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeGoalAndHan dleFailures(DefaultLifecycleExecutor.java:387) at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeTaskSegmen ts(DefaultLifecycleExecutor.java:348) at org.apache.maven.lifecycle.DefaultLifecycleExecutor.execute(DefaultLi fecycleExecutor.java:180) at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:328) at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:138) at org.apache.maven.cli.MavenCli.main(MavenCli.java:362) at org.apache.maven.cli.compat.CompatibleMain.main(CompatibleMain.java:6 0) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl. java:39) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAcces sorImpl.java:25) at java.lang.reflect.Method.invoke(Method.java:597) at org.codehaus.classworlds.Launcher.launchEnhanced(Launcher.java:315) at org.codehaus.classworlds.Launcher.launch(Launcher.java:255) at org.codehaus.classworlds.Launcher.mainWithExitCode(Launcher.java:430) at org.codehaus.classworlds.Launcher.main(Launcher.java:375) 2010-06-02 15:13:29.417::INFO: Started [email protected]:8080 [INFO] Started Jetty Server [INFO] Starting scanner at interval of 3 seconds.
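For context: this failure almost always means the Spring root context was told to load /WEB-INF/xfire-servlet.xml, but the file is not under src/main/webapp/WEB-INF, so the jetty:run plugin cannot serve it. Either add the file there or remove it from the context configuration. A hedged sketch of the kind of web.xml entry usually involved (the applicationContext.xml entry is an assumption for illustration; only the files your project actually has should be listed):

    <!-- web.xml: contextConfigLocation lists every Spring file the root
         context loads; each listed file must exist under WEB-INF -->
    <context-param>
        <param-name>contextConfigLocation</param-name>
        <param-value>
            /WEB-INF/applicationContext.xml
            /WEB-INF/xfire-servlet.xml
        </param-value>
    </context-param>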


  • How can I execute an insert with data from a repeater-generated form whose data source is SQL?

    - by Duke
    I'm storing multilingual data in a database whose model is language-normalized (like this). For this particular problem, the key for the table in question consists of a value entered by the user and a language from the language table. I'd like to dynamically generate a form with input fields for all available languages. The user enters a key value, then goes down a list of field sets filling out the information in each language. In this case there are two fields for every language: a name and a value (the value is language-dependent). I display all existing information on the page with a GridView, below which I have a FormView that is always in insert mode, allowing the user to enter new data. Within the FormView I have a Repeater with a SqlDataSource that gets the list of available languages:

        <asp:Repeater ID="SessionLocaleRepeater" runat="server" DataSourceID="LocaleSQLDataSource" EnableViewState="false">
            <ItemTemplate>
                <tr>
                    <th scope="row"><%# DataBinder.Eval(Container.DataItem, "LocaleName") %></th>
                    <td>Name:</td>
                    <td><asp:TextBox ID="TextBox1" runat="server" Text="" /></td>
                    <td>Number:</td>
                    <td><asp:TextBox ID="TextBox2" runat="server" Text="" /></td>
                </tr>
            </ItemTemplate>
        </asp:Repeater>

    I figured that in order to insert this data I'd have to execute my SQL Server insert stored procedure for each item in the repeater, so I am trying to use the FormView's Inserting event. The problem is that the repeater isn't data-bound to the SqlDataSource until after the FormView's Inserting event (the Inserting event fires during the postback event phase, while the databind happens in PreRender), which means the controls and data are not available when the Inserting event fires. I tried databinding the repeater during the FormView's Inserting event; the controls were available but the data was not. Would this have something to do with how/when the viewstate information is re-added to the controls? From what I've read, viewstate is one of the first things to be restored. Given the order of events, how can I get the data I need for the insert? I'm open to other solutions for creating dynamic input controls, but they will have to query the database to determine how many sets of controls to create.
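    One workaround, sketched below under assumptions (the FormView is called ProductFormView here, which is hypothetical, and the repeater renders plain textboxes as in the markup above): the posted textbox values are still present in Request.Form even though the repeater's controls have not been recreated, so the Inserting event can read them by UniqueID suffix. Note the locale itself would also need to be posted back (e.g. via a HiddenField in the ItemTemplate), since the <th> text is not part of the form post.

        // Hedged sketch: read the repeater's posted values without rebinding it.
        protected void ProductFormView_ItemInserting(object sender, FormViewInsertEventArgs e)
        {
            foreach (string key in Request.Form.AllKeys)
            {
                // Keys look like "ProductFormView$SessionLocaleRepeater$ctl01$TextBox1".
                if (key.EndsWith("$TextBox1"))
                {
                    string prefix = key.Substring(0, key.Length - "TextBox1".Length);
                    string name   = Request.Form[key];                  // the Name field
                    string number = Request.Form[prefix + "TextBox2"];  // the Number field
                    // TODO: call the insert stored procedure once per locale here,
                    // passing the locale id posted back from a HiddenField.
                }
            }
        }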


  • Unable to open Visual Studio 2005 solution attached to VSS?

    - by Amitabh
    I am unable to open a Visual Studio 2005 solution under VSS. The solution contains an ASP.NET web site and around 10 more projects. I have already taken the latest source code from the VSS client, and I have made everything writeable. All I want to do is to see if the latest from VSS compiles on my machine. Is there anything I am missing here?


  • CodePlex Daily Summary for Monday, April 19, 2010

    CodePlex Daily Summary for Monday, April 19, 2010

    New Projects
    - 8085 Microprocessor simulator: This program allows you to write 8085 programs in assembly and run those programs on your PC. It comes with lots of help, plus you can put breakpo...
    - Additional.NET framework: The Additional.Net framework extends the functionality of the .NET framework for easier application development. It is developed in C#.
    - Astoria Contrib: A contrib project for filling the gaps in WCF Data Services, providing missing functionality or augmenting with T4 templates, helpers, etc.
    - ClipoWeb: ClipoWeb is a web clipboard that allows you to copy text and files between computers. Users access a web page on the source and destination compute...
    - elearning Center: A web application written entirely in Silverlight. It is an e-learning system with full functionality and maximum interactivity...
    - Excel VSTO SQL Server Browser: Get data from SQL Server and put it in Excel directly. The objective is to get more control over what you need to pull and create automatic pro...
    - Generic Tree Structure: Generic base classes that help you create complex tree structures without writing them again and again. Simple to use, like "var Node<Folder> fold...
    - LAN Lordz LAN Party System: The LAN Lordz LAN Party System makes it easier for medium and large size events to track their attendance, sponsors, door prizes, tournaments, and...
    - LiteFx: LiteFx is a framework that helps with implementing DDD (anemic or rich); it was developed by Douglas Aguiar (http://twitter.com/DouglasAgui...
    - Managed UI Flow for ASP.NET MVC Framework: If your web application is getting more complex, understanding and managing complex UI flows (the page flow of the application) gets harder and harder. If...
    - Meus Exemplos: My examples.
    - Orchard Blueprint Theme: Orchard BluePrint is a project that provides a WYSIWYG reference implementation of an Orchard theme to help designers get started with theme design....
    - Outlook Social Network Connector - Avatar: Avatar is an open-source MS Outlook add-in that lets Douban users use Douban from within Outlook 2010: view the broadcasts, local events, and Douban mail of a message's senders and recipients, and follow friends' updates without visiting Douban. Written in C# on .NET 4.0; API requests are authenticated with OAuth. (Avat...
    - Quadro Tree: This is a quadtree library.
    - Sharepoint 2010 Alert Controller: In MOSS 2007 or SharePoint Server 2010, if you want to see your alerts by list name you should use this tool.
    - SharePoint Web Parts: The goal of this project is to develop a set of web parts for SharePoint.
    - Silverlight Image Cropper: This is a Silverlight 4 util that makes it easy to crop out a number of images of a specific resolution screen or screens, i.e. an easy way to crop ...
    - SilverlightFTP: Silverlight FTP client.
    - splibex: Libraries for SharePoint list manipulation.
    - StardustExtensions: Official extensions for Stardust.
    - Swim Team Manager: Swim Team Manager is designed for managing and tracking administrative and performance information for your club, school, or swim team. Swim Tea...
    - ToDoListWpf: A to-do list; I used it to manage my work items. I am sorry for my poor English.
    - Trance Layer: TranceLayer is a fast and flexible logging and diagnostics framework for .NET. It allows you to plug it into an existing or new application with m...
    - Unoficcial NeoFM.hu NowPlaying: A little Windows tray program. Shows what's on neofm.hu right now.
    - WabbitStudio Z80 Software Tools: The software suite provides all of the tools you need to create high quality Z80 software in Z80 assembly language, with a focus on TI calculators....
    - WinToolbar: Windows.Toolbar is a Silverlight library that implements common widgets, allowing us to build a rich toolbar control in our applications; it incor...
    - XP-More: XP-More is a tool that helps manage Windows 7 Virtual Machines (XP Mode and any other). Specifically, it makes duplication of VMs a no-brainer - no...
    - Yodelay .NET Framework Extensions: The Yodelay .NET Framework Extensions project provides a library of components that make many kinds of programming tasks simpler. These include bas...

    New Releases
    - ClipoWeb: ClipoWeb 1.0: First beta release of the ClipoWeb web application.
    - DDDSample.Net: 0.8: This release contains all four versions of DDDSample.Net: those available in the previous release, 0.7, and a brand new one: the Layered Model version. Layered Model demon...
    - DotNetNuke Blueprint: 00.00.02: Added to this version: CSS Reset; skin version including grids. This version will soon be updated with the corresponding HTML version and DNN template.
    - Esferatec.Text.RegularExpressions: 3.5.1003.1001: First stable release of the class; the assembly file is ready to use and the documentation is complete.
    - Excel VSTO SQL Server Browser: Sample Only: Sample without the Ribbon UI; if you close the TaskPane you will no longer be able to open it without restarting Excel.
    - Folder Bookmarks: Folder Bookmarks 1.5.5: This is the latest version of Folder Bookmarks (1.5.5), with the new Archive Manager and Archive Viewer. It has an installer - it will create a dir...
    - Gardens Point LEX: Gardens Point LEX, Version 1.1.3: The main distribution is a zip file. This contains the binary executable, documentation, source code and the examples. Changes: Version 1.1.3 corre...
    - Gardens Point Parser Generator: Gardens Point Parser Generator V1.4.0: The distribution is a zip archive which contains the binary executables, documentation, source code and examples. Changes: Version 1.4.0 of GPPG has...
    - HKGolden Express: HKGoldenExpress (Build 201004181455): New features: added rating of each topic. (Note: this feature is available since Build 201004172120.) Bug fix: handle invalid XML character in XML...
    - Home Access Plus+: v4.0.0.0 Beta: v4.0.0.0 Beta change log: moved to using .NET 4.0; new Silverlight uploader; various .NET 4 fixes and tweaks. File changes: all fixes have changed.
    - HTML Ruby: 6.21.6: Reduced performance hit on pages with heavy DOM manipulations. Fixed issue where empty tags caused it to apply invalid spacing values. Stop spaci...
    - LINQ to VFP: LinqToVfp (v1.0.17.2): Modified to allow using RecNo as a primary key. This build requires IQToolkit v0.17b.
    - Managed UI Flow for ASP.NET MVC Framework: Preview 1: The source available on this site does not reflect the final state of the project; it is a preview of what will be shipping in the framework in th...
    - MVVM Light Toolkit: MVVM Light Toolkit V3 SP1 (2): Super minor update to accommodate the new Blend 4 RC. Only changes: the path to the Blend 4 templates changed to be "My Documents\Expression\Blend...
    - N2 CMS: 2.0 rc: N2 is a lightweight CMS framework for ASP.NET. It helps you build great web sites that anyone can update. Major changes (1.5 -> 2.0 release candid...
    - OpenGL ES 2.0 Compact Framework Wrapper: Sample application CAB with texturing: This took some time as it was pretty hard to get the texture loaded and set up so that it would bind to the sampler2D in the fragment shader. Featu...
    - Orchard Blueprint Theme: 00.00.01: This is the first release of this project, still in a very alpha version. Very soon this release is to be updated with the HTML version of the them...
    - RoughJs: RoughJsSL: This is the Silverlight library's compiler.
    - Sharepoint 2010 Alert Controller: Sharepoint 2010 Alert Controller: After you download the WSP file you can get help from the Home Page.
    - SharePoint LogViewer: SharePoint LogViewer 2.5: Minimize log viewer to tray; get popup notification of SharePoint log events from tray; redirect log entries to event log; send email notifications on...
    - Site Directory for SharePoint 2010 (from Microsoft Consulting Services, UK): v1.1: This is a minor update which includes the following changes: code consolidation across the whole project; additional site data captured. See solut...
    - Stardust: Stardust 1.0: First stable version of Stardust (Build 172).
    - StardustExtensions: Facebook Extension: Extension for Stardust to upload and post images on Facebook.
    - StardustExtensions: Facebook Extension (Source): The source code of an extension for Stardust used to post images on Facebook.
    - StardustExtensions: WPF Example: This is an example extension. Uses WPF to create a window and say "Hello World!" A perfect download if you want to start writing Stardust extensions.
    - StardustExtensions: WPF Example Source: This is the source code of an extension that creates a window using WPF and displays a simple text. Great as an example of creating Stardust exten...
    - TFTP Server: TFTP Server 1.1 Beta installer: New MSI based installer. Installs a TFTP service. Supports multiple servers on different endpoints, with every server pointing to its own root di...
    - TiledLib: TiledLib 1.1: This download is for prebuilt DLLs and a demo project. For the full source code, use the Source Code tab. Changes: bug fixes in a few methods; ad...
    - Trance Layer: TranceLayer Digger: The Digger version is a beta. It is intended to be used as a demonstration of muscles while lacking a set of features that are in the docs. The set of ...
    - uManage - Active Directory Self-Service Portal: uManage v1.2 (.NET 4.0 RTM): New release v1.2 adds the Administrative Portal as well as the requirement of an MSSQL database (2005+). The Setup Wizard has also been updated to i...
    - Unoficcial NeoFM.hu NowPlaying: NeoNotifier: First release. Alpha, but usable.
    - VidCoder: 0.2.1: Changes: added 2-pass encoding; fixed x264 options getting mangled during p/invoke; fixed intermittent crash with logging window open due to thre...
    - WCF RIA Services Contrib: WCF RIA Services Contrib RC2 Release: This version is for the WCF RIA Services RC2 (SL4 RTM) release. The ApplyState has been modified in this version to disable validation during proces...
    - WiiCIS.NET: WiiCIS.NET v0.2: Changes: removal of WiimoteManager (connection must now be done manually); accelerometer orientation was originally in degrees and is now in radians; ...
    - WinToolbar: WinToolbar Source code plus sample: This zip file contains the current version source code and libraries plus a test runner (sample app).
    - XP-More: 0.9 (Beta): Most of the functionality is in place; final polishing will be done soon.

    Most Popular Projects
    - Facebook Developer Toolkit
    - WSPBuilder (SharePoint WSP tool)
    - QuickGraph, Graph Data Structures And Algorithms for .Net
    - Performance Analysis of Logs (PAL) Tool
    - patterns & practices: Team Development with Visual Studio Team Foundation Server
    - TFS Integration Platform
    - patterns & practices: Performance Testing Guidance for Web Applications
    - patterns & practices: Enterprise Library Contrib
    - JSON Viewer
    - Managed Wifi API

    Most Active Projects
    - Rawr
    - patterns & practices – Enterprise Library
    - Industrial Dashboard
    - Ionics Isapi Rewrite Filter
    - Farseer Physics Engine
    - MVVM Light Toolkit
    - jQuery Library for SharePoint Web Services
    - N2 CMS
    - Caliburn: An Application Framework for WPF and Silverlight
    - BlogEngine.NET


  • checking apt-get update lock file

    - by stewy613
    I have installed a dual boot beside Windows and now I'm having a problem with "apt-get update". When I type apt-get update, this is the outcome, and I don't know what to do:

    anthony@anthony-Inspiron-530s:~$ ls
    Desktop    Downloads         examples.desktop~  Pictures  Templates
    Documents  examples.desktop  Music              Public    Videos
    anthony@anthony-Inspiron-530s:~$ apt-get update
    E: Could not open lock file /var/lib/apt/lists/lock - open (13: Permission denied)
    E: Unable to lock directory /var/lib/apt/lists/
    E: Could not open lock file /var/lib/dpkg/lock - open (13: Permission denied)
    E: Unable to lock the administration directory (/var/lib/dpkg/), are you root?
    anthony@anthony-Inspiron-530s:~$ apt-get upgrade
    E: Could not open lock file /var/lib/dpkg/lock - open (13: Permission denied)
    E: Unable to lock the administration directory (/var/lib/dpkg/), are you root?
    anthony@anthony-Inspiron-530s:~$ cd apt-get update
    bash: cd: apt-get: No such file or directory
    anthony@anthony-Inspiron-530s:~$
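    The "(13: Permission denied)" and "are you root?" messages mean the commands simply need root privileges; running them through sudo (enabled by default for the first user on Ubuntu) should clear both errors. The cd apt-get update line is a separate mistake: cd only changes directories and has no role here.

        sudo apt-get update
        sudo apt-get upgrade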


  • Problem with reformatting Sandisk read-only USB drive

    - by dimas
    Hi everyone. I have a problem reformatting my USB drive. It shows this error message whenever I try to reformat it using GParted:

    GParted 0.11.0 --enable-libparted-dmraid
    Libparted 2.3
    Format /dev/sdb1 as ntfs    00:00:01    (ERROR)
    calibrate /dev/sdb1         00:00:01    (SUCCESS)
        path: /dev/sdb1
        start: 22,768
        end: 31,248,383
        size: 31,225,616 (14.89 GiB)
    set partition type on /dev/sdb1    00:00:00    (ERROR)
    libparted messages    (INFO)
        Unable to open /dev/sdb read-write (Read-only file system). /dev/sdb has been opened read-only.
        [the same message repeats several more times]
        Can't write to /dev/sdb, because it is opened read-only.
        Unable to open /dev/sdb read-write (Read-only file system). /dev/sdb has been opened read-only.

    The reason I want to reformat it is that it suddenly stopped working while I was transferring files from my PC, and it also erased everything that was saved on the drive. I have tried various tutorials on the net, but I can't find a solution for changing a USB drive's read-only property to read-write, or anything else that would let me reformat it. I have also checked this link (format read-only USB drive), but that one doesn't have a solution either. I am attempting to do this on Ubuntu 12.04.
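    A hedged diagnostic sketch, not a guaranteed fix: check whether the kernel or the drive itself has flagged the device write-protected. If the flag is set at the controller level - which many flash drives do when their NAND is failing - no software setting will clear it, and the drive is effectively dead for writes.

        # does the kernel consider the block device read-only? (1 = yes)
        cat /sys/block/sdb/ro
        # look for "Write Protect is on" or I/O errors around the failure
        dmesg | tail -n 50
        # hdparm can show and try to clear the software read-only flag
        sudo hdparm -r /dev/sdb     # show the flag
        sudo hdparm -r0 /dev/sdb    # attempt to clear it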


  • Can I legally publish my Fortran 90 wrappers to nVidias CUFFT library (from CUDA SDK)?

    - by Jakub Narebski
    From a legal standpoint (licensing issues): may I, in agreement with the license, publish Fortran 90 wrappers (bindings) to the CUFFT library from the nVidia CUDA Toolkit under some open-source license (either CC0, i.e. public domain, or some kind of permissive license like BSD)? nVidia provides only C bindings with their CUDA SDK. The header files contain the following text:

    /*
     * Copyright 1993-2011 NVIDIA Corporation. All rights reserved.
     *
     * NOTICE TO LICENSEE:
     *
     * This source code and/or documentation ("Licensed Deliverables") are
     * subject to NVIDIA intellectual property rights under U.S. and
     * international Copyright laws.
     *
     * These Licensed Deliverables contained herein is PROPRIETARY and
     * CONFIDENTIAL to NVIDIA and is being provided under the terms and
     * conditions of a form of NVIDIA software license agreement by and
     * between NVIDIA and Licensee ("License Agreement") or electronically
     * accepted by Licensee. Notwithstanding any terms or conditions to
     * the contrary in the License Agreement, reproduction or disclosure
     * of the Licensed Deliverables to any third party without the express
     * written consent of NVIDIA is prohibited.

    The License.txt file includes the following fragment:

    Source Code: Developer shall have the right to modify and create derivative works with the Source Code. Developer shall own any derivative works ("Derivatives") it creates to the Source Code, provided that Developer uses the Materials in accordance with the terms and conditions of this Agreement. Developer may distribute the Derivatives, provided that all NVIDIA copyright notices and trademarks are propagated and used properly and the Derivatives include the following statement: "This software contains source code provided by NVIDIA Corporation."


  • Forwarding udp ports iptables packets "lost"?

    - by Dindihi
    I have a Linux router (Debian 6.x) where I forward some ports to internal services. Some TCP ports (like 80 and 22) are fine. I have one application listening on UDP port 54277; no return traffic comes from this app, I only receive data on this port.

    Router settings:

    cat /proc/sys/net/ipv4/conf/all/rp_filter = 1
    cat /proc/sys/net/ipv4/conf/eth0/forwarding = 1
    cat /proc/sys/net/ipv4/conf/ppp0/forwarding = 1

    $IPTABLES -t nat -I PREROUTING -p udp -i ppp0 --dport 54277 -j DNAT --to-destination $SRV_IP:54277
    $IPTABLES -I FORWARD -p udp -d $SRV_IP --dport 54277 -j ACCEPT

    MASQUERADING of internal traffic out to ppp0 (the internet) is also active and working. The default policy for INPUT, OUTPUT and FORWARD is DROP.

    What is strange: when I run

    tcpdump -p -vvvv -i ppp0 port 54277

    I get a lot of traffic:

    18:35:43.646133 IP (tos 0x0, ttl 57, id 0, offset 0, flags [DF], proto UDP (17), length 57) source.ip > own.external.ip.54277: [udp sum ok] UDP, length 29
    18:35:43.652301 IP (tos 0x0, ttl 57, id 0, offset 0, flags [DF], proto UDP (17), length 57) source.ip > own.external.ip.54277: [udp sum ok] UDP, length 29
    18:35:43.653324 IP (tos 0x0, ttl 57, id 0, offset 0, flags [DF], proto UDP (17), length 57) source.ip > own.external.ip.54277: [udp sum ok] UDP, length 29
    18:35:43.655795 IP (tos 0x0, ttl 57, id 0, offset 0, flags [DF], proto UDP (17), length 57) source.ip > own.external.ip.54277: [udp sum ok] UDP, length 29
    18:35:43.656727 IP (tos 0x0, ttl 57, id 0, offset 0, flags [DF], proto UDP (17), length 57) source.ip > own.external.ip.54277: [udp sum ok] UDP, length 29
    18:35:43.659719 IP (tos 0x0, ttl 57, id 0, offset 0, flags [DF], proto UDP (17), length 57) source.ip > own.external.ip.54277: [udp sum ok] UDP, length 29

    But with tcpdump -p -i eth0 port 54277 (on the same machine, the router) I see much less traffic. Also, on the destination $SRV_IP only a few packets arrive, not all of them:

    INTERNAL SERVER:
    19:15:30.039663 IP source.ip.52394 > 192.168.215.4.54277: UDP, length 16
    19:15:30.276112 IP source.ip.52394 > 192.168.215.4.54277: UDP, length 16
    19:15:30.726048 IP source.ip.52394 > 192.168.215.4.54277: UDP, length 16

    So some UDP packets are being ignored/dropped? Any idea what could be wrong?

    Edit: This is strange: the FORWARD rule counts data packets, but the PREROUTING rule shows 0 packets...

    iptables -nvL -t filter | grep 54277
    Chain FORWARD (policy DROP 0 packets, 0 bytes)
      168  8401 ACCEPT udp -- * * 0.0.0.0/0 192.168.215.4 state NEW,RELATED,ESTABLISHED udp dpt:54277

    iptables -nvL -t nat | grep 54277
    Chain PREROUTING (policy ACCEPT 405 packets, 24360 bytes)
        0     0 DNAT udp -- ppp0 * 0.0.0.0/0 my.external.ip udp dpt:54277 state NEW,RELATED,ESTABLISHED to:192.168.215.4
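    Two things worth checking, offered as a hedged sketch rather than a definitive answer. First, the nat PREROUTING table only sees the first packet of each connection-tracking flow; later packets of an ESTABLISHED flow are DNATed by conntrack without re-traversing PREROUTING, so a zero counter there alongside a growing FORWARD counter is not by itself proof of loss. Second, rp_filter=1 on an asymmetric ppp0 route can silently drop packets before iptables ever counts them. Logging in the raw table, which runs before conntrack and NAT, shows what actually arrives:

        # log every packet to the port before conntrack/NAT touches it
        iptables -t raw -I PREROUTING -p udp --dport 54277 -j LOG --log-prefix "udp54277: "
        # then compare the kernel log against the rule counters
        tail -f /var/log/kern.log
        # temporarily relax reverse-path filtering to rule it out
        # (the effective value is the stricter of "all" and the interface)
        echo 0 > /proc/sys/net/ipv4/conf/all/rp_filter
        echo 0 > /proc/sys/net/ipv4/conf/ppp0/rp_filter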


  • Crashplan not starting. Fails to find swt-gtk

    - by Pibben
    I've installed CrashPlan on my (K)Ubuntu computer. However, it fails to start, and ui_output.log says:

    [09.02.12 15:24:43.518 INFO main root ] *************************************************************
    [09.02.12 15:24:43.519 INFO main root ] *************************************************************
    [09.02.12 15:24:43.524 INFO main root ] Loading lib/swt-64.jar, exists=true
    [09.02.12 15:24:43.525 INFO main root ] [file:/usr/local/crashplan/lib/com.backup42.desktop.jar, file:/usr/local/crashplan/lang/, file:/usr/local/crashplan/skin/, file:/usr/local/crashplan/lib/swt-64.jar]
    [09.02.12 15:24:43.527 INFO main root ] STARTED CrashPlanDesktop
    [09.02.12 15:24:43.528 INFO main root ] CPVERSION = 3.2.1 - 1332824401321 (2012-03-27T05:00:01:321+0000)
    [09.02.12 15:24:43.529 INFO main root ] ARGS = [ ]
    [09.02.12 15:24:43.531 INFO main root ] LOCALE = English (United States)
    [09.02.12 15:24:43.570 ERROR main com.backup42.desktop.CPDesktop ] Failed to launch CPDesktop; java.lang.UnsatisfiedLinkError: no swt-gtk-3557 or swt-gtk in swt.library.path, java.library.path or the jar file
    java.lang.UnsatisfiedLinkError: no swt-gtk-3557 or swt-gtk in swt.library.path, java.library.path or the jar file
            at org.eclipse.swt.internal.Library.loadLibrary(Unknown Source)
            at org.eclipse.swt.internal.Library.loadLibrary(Unknown Source)
            at org.eclipse.swt.internal.C.<clinit>(Unknown Source)
            at org.eclipse.swt.internal.Converter.wcsToMbcs(Unknown Source)
            at org.eclipse.swt.internal.Converter.wcsToMbcs(Unknown Source)
            at org.eclipse.swt.widgets.Display.<clinit>(Unknown Source)
            at com.backup42.desktop.CPDesktop.<init>(CPDesktop.java:231)
            at com.backup42.desktop.CPDesktop.main(CPDesktop.java:161)
    [09.02.12 15:24:43.570 ERROR main root ] Failed to launch CPDesktop; java.lang.UnsatisfiedLinkError: no swt-gtk-3557 or swt-gtk in swt.library.path, java.library.path or the jar file

    I've installed the relevant (I think) swt-gtk packages.
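    A hedged sketch of the usual line of attack (package name is Ubuntu's; the native library's version must match the SWT jar CrashPlan ships, and the paths below assume a default install, so verify each one): install the system SWT GTK bindings, then make the native libswt-gtk .so visible to CrashPlan's JVM via java.library.path.

        sudo apt-get install libswt-gtk-3-java
        ls /usr/lib/jni/libswt-gtk-*.so   # confirm where the native library lives
        # then add -Djava.library.path=/usr/lib/jni to the desktop client's Java
        # options (in a default install: /usr/local/crashplan/bin/run.conf)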


  • Introduction to LinqPad Driver for StreamInsight 2.1

    - by Roman Schindlauer
    We are announcing the availability of the LinqPad driver for StreamInsight 2.1. The purpose of this blog post is to offer a quick introduction to the new features we added to the StreamInsight LinqPad driver. We'll show you how to connect to a remote server, how to inspect the entities present on that server, how to compose on top of them, and how to manage their lifetime.

    Installing the driver

    Info on how to install the driver can be found in an earlier blog post here.

    Establishing connections

    As you click on the "Add Connection" link in the left pane, you will notice that it's now possible to build the data context automatically. The new driver appears as an option in the upper list, and if you pick it you will open a connection dialog that lets you connect to a remote StreamInsight server. The connection dialog lets you specify the address of the remote server. You will notice that it's possible to pick up the binding information from the configuration file of the LinqPad application (which is normally in the same folder as LinqPad.exe and is called LinqPad.exe.config). In order for the context to be generated you need to pick an application from the server. The control is editable, so you can create a new application if you don't want to make changes to an existing one. If you choose a new application name you will be prompted for confirmation before it gets created. Once you click OK the connection is created and you can start issuing queries against the remote server. If there's any connectivity error the connection is marked with a red X and you can see the error message informing you what went wrong (i.e., the remote server could not be reached, etc.).

    The context for remote servers

    Let's take a look at what happens after we are connected successfully. Every LinqPad query runs inside a context - think of it as a class that wraps all the code that you're writing. If you're connecting to a live server the context will contain the following:

    - The application object itself.
    - All entities present in this application (sources, sinks, subjects and processes).

    The picture below shows a snapshot of the left pane of LinqPad after a successful connection. Every entity on the server has a different icon which will allow users to figure out its purpose. You will also notice that some entities have a string in parentheses following the name. It should be interpreted as such: the first name is the name of the property of the context class, and the second name is the name of the entity as it exists on the server. Not all valid entity names are valid identifier names, so in cases where we had to make a transformation you see both. Note also that as you hover over the entities you get IntelliSense with their types - more on that later.

    Remoting is not supported

    As you play with the entities exposed by the context you will notice that you can't read from and write to them directly. If, for instance, you try to dump the content of an entity you will get an error message telling you that in the current version remoting is not supported. This is because the entity lives on the remote server, and dumping its content means reading the events produced by this entity into the local process.

    ObservableSource.Dump();

    will yield the following error:

    Reading from a remote 'System.Reactive.Linq.IQbservable`1[System.Int32]' is not supported. Use the 'Microsoft.ComplexEventProcessing.Linq.RemoteProvider.Bind' method to read from the source using a remote observer.

    This basically tells you that you can call the Bind() method to direct the output of this source to a sink that has to be defined on the remote machine as well. You can't bring the results to the LinqPad window unless you write code specifically for that.

    Compose queries

    You may ask - what's the purpose of all that? After all, the same information is present in the EventFlowDebugger, so why bother showing it in LinqPad? First of all, what gets exposed in LinqPad is not what you see in the debugger. In LinqPad we have a property on the context class for every entity that lives on the server. Because LinqPad offers IntelliSense, we in fact have much more information about the entity, and more importantly we can compose with that entity very easily. For example, let's say that this code creates an entity:

    using (var server = Server.Connect(...))
    {
        var a = server.CreateApplication("WhiteFish");
        var src = a
            .DefineObservable<int>(() => Observable.Range(0, 3))
            .Deploy("ObservableSource");

    If later we want to compose with the source, we have to fetch it and then we can bind something to it:

        a.GetObservable<int>("ObservableSource").Bind(...

    This means that we had to know a bunch of things about it: that it's a source, that it's an observable, that it produces a result with payload Int32, and that it's named "ObservableSource". Only the second and last bits of information are present in the debugger, by the way. As you type in the query window you see that all the entities are present, you get IntelliSense support for them, and it's much easier to make sense of what's available.

    Let's look at a scenario where composition is plausible. With the new programming model it's possible to create "cold" sources that are parameterized. There was a way to accomplish that even in the previous version by passing parameters to the adapters, but this time it's much more elegant because the expression declares what parameters are required. Say that we hover the mouse over the ThrottledSource source - we will see that its type is Func<int, int, IQbservable<int>>. This in effect means that we need to pass two int parameters before we can get a source that produces events, and the type of those events is int. In the particular case of my example, I had the source produce a range of integers and the two parameters were the start and end of the range. So we see how a developer can create a source that is not running yet. Then someone else (e.g. an administrator) can pass whatever parameters are appropriate and run the process.

    Proxy Types

    Here's an interesting scenario - what if someone created a source on a server but forgot to tell you what type they used? Worse yet, they might have used an anonymous type, and even though they can refer to it by name, you can't figure out how to use that type. Let's walk through an example that shows how you can compose against types you don't have the definition of. This is how we can create a source that returns an anonymous type:

    Application.DefineObservable(() => Observable.Range(1, 10).Select(i => new { I = i })).Deploy("O1");

    Now if we refresh the connection we can see the new source named O1 appear in the list. But what's more important is that we now have a type to work with. So we can compose a query that refers to the anonymous type:

    var threshold = new StreamInsightDynamicDriver.TypeProxies.AnonymousType1_0<int>(5);
    var filter = from i in O1
                 where i > threshold
                 select i;
    filter.Deploy("O2");

    You will notice that the anonymous type defined with the statement new { I = i } can now be manipulated by a client that does not have access to it, because the LinqPad driver has generated another type in its stead, named StreamInsightDynamicDriver.TypeProxies.AnonymousType1_0. This type has all the properties and fields of the type defined on the server, except in this case we can instantiate values and use it to compose more queries. It is worth noting that the same thing works for types that are not anonymous - the test is whether the LinqPad driver can resolve the type or not. If it can't, a new type will be generated that approximates the type that exists on the server.

    Control metadata

    In addition to composing processes on top of the existing entities, we can do other useful things. We can delete them - nothing new here, as we simply access the entities through the Entities collection of the application class. Here is where having their real name in parentheses comes in handy. There's another way to find out what's behind a property - dump its expression. The first line in the output tells us the name of the entity used to build this property in the context.

    Runtime information

    So let's create a process to see what happens. We can bind a source to a sink and run the resulting process. If you right-click on the connection you can refresh it and see the process present in the list of entities. Then you can drag the process to the query window and see that you have access to the process object in the Processes collection of the application. You can then manipulate the process (delete it, read its diagnostic view, etc.).

    Regards,
    The StreamInsight Team
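    To make that last section concrete, here is a minimal sketch (not from the article) of binding the deployed source to a sink and running it as a named process. The console observer, the sink name "ConsoleSink" and the process name "FilterProcess" are assumptions for illustration; ObservableSource is the context property discussed above.

        // hedged sketch: deploy an observer on the server, bind, and run
        var sink = Application
            .DefineObserver(() => Observer.Create<int>(i => Console.WriteLine(i)))
            .Deploy("ConsoleSink");                      // sink name is made up
        ObservableSource.Bind(sink).Run("FilterProcess"); // process name is made up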


  • Server Firewall preventing sending of email [migrated]

    - by Jo Fitzgerald
    The firewall on my VPS appears to be preventing my site from sending email. It was working fine until the end of last month. My hosting provider (Webfusion) has been next to useless. I am able to send email if I open INPUT ports 32768-65535, but not if these ports are closed. Why would this be? I have the following rules in my firewall:

    # sudo iptables -L
    Chain INPUT (policy DROP)
    target      prot opt source       destination
    VZ_INPUT    all  --  anywhere     anywhere

    Chain FORWARD (policy DROP)
    target      prot opt source       destination
    VZ_FORWARD  all  --  anywhere     anywhere

    Chain OUTPUT (policy DROP)
    target      prot opt source       destination
    VZ_OUTPUT   all  --  anywhere     anywhere

    Chain VZ_FORWARD (1 references)
    target      prot opt source       destination

    Chain VZ_INPUT (1 references)
    target  prot opt source                 destination
    ACCEPT  tcp  --  anywhere               anywhere               tcp dpt:www
    ACCEPT  tcp  --  anywhere               anywhere               tcp dpt:https
    ACCEPT  tcp  --  anywhere               anywhere               tcp dpt:smtp
    ACCEPT  tcp  --  anywhere               anywhere               tcp dpt:ssmtp
    ACCEPT  tcp  --  anywhere               anywhere               tcp dpt:pop3
    ACCEPT  tcp  --  anywhere               anywhere               tcp dpt:domain
    ACCEPT  udp  --  anywhere               anywhere               udp dpt:domain
    ACCEPT  tcp  --  anywhere               anywhere               tcp dpts:32768:65535
    ACCEPT  udp  --  anywhere               anywhere               udp dpts:32768:65535
    ACCEPT  tcp  --  localhost.localdomain  localhost.localdomain
    ACCEPT  udp  --  localhost.localdomain  localhost.localdomain

    Chain VZ_OUTPUT (1 references)
    target  prot opt source       destination
    ACCEPT  tcp  --  anywhere     anywhere
    ACCEPT  udp  --  anywhere     anywhere

    The VPS is running Plesk 10.4.4 (please ask if you require further technical information to help me).
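    A hedged explanation that fits the symptoms: outbound SMTP connections leave from a random local port in the ephemeral range (32768-65535 on Linux), and with an INPUT policy of DROP and no connection-tracking rule anywhere in the chain, the replies from the remote mail servers are discarded - which is exactly why sending only works when that whole range is open. The usual fix is a single stateful rule instead of the wide-open port range (chain name taken from the output above; verify against your Plesk-managed rules before applying):

        # accept replies to connections this host initiated
        iptables -I VZ_INPUT 1 -m state --state ESTABLISHED,RELATED -j ACCEPT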

