Joint extraction of entities and relations from unstructured text is an important task in information extraction. Existing methods have achieved considerable performance but still suffer from inherent limitations such as error propagation, redundant relation prediction, and the inability to handle overlapping relations. To address these issues, this paper proposes BSGB (BiLSTM + SDA-GAT + BiGCN), a joint entity and relation extraction model based on graph neural networks. BSGB is a two-stage prediction process. In the first stage, the model extends semantic dependency analysis to a semantic dependency graph and introduces a graph attention network that integrates this graph (SDA-GAT). By stacking BiLSTM and SDA-GAT layers, it extracts sentence-level sequential features and local dependency features, and performs entity span detection and preliminary relation prediction. In the second stage, a relation-weighted GCN further models the interaction between entities and relations and completes the final extraction of entity-relation triples. Experimental results on the NYT dataset show that the model achieves an F1 score of 67.1%, which is 5.2% higher than the baseline model on this dataset, and the prediction of overlapping relations is also significantly improved.
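The following is a minimal PyTorch sketch of the two-stage pipeline described above, written only to make the data flow concrete: a BiLSTM stacked with a graph attention layer over a semantic dependency adjacency (stage 1, entity spans and preliminary relations), followed by a relation-weighted GCN pass (stage 2, final relation scoring). All layer sizes, the single-head GAT formulation, the BIO span head, and the way the relation-weighted adjacency is built from preliminary scores are assumptions for illustration; the abstract does not specify the paper's exact SDA-GAT or BiGCN details.

```python
# Illustrative sketch of a BSGB-style two-stage extractor (assumed details, not the paper's exact model).
import torch
import torch.nn as nn
import torch.nn.functional as F


class GATLayer(nn.Module):
    """Single-head graph attention restricted to semantic-dependency edges."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.W = nn.Linear(in_dim, out_dim, bias=False)
        self.a = nn.Linear(2 * out_dim, 1, bias=False)

    def forward(self, h, adj):                         # h: (B, N, D), adj: (B, N, N) in {0, 1}
        z = self.W(h)
        n = z.size(1)
        zi = z.unsqueeze(2).expand(-1, -1, n, -1)      # node i features, broadcast over j
        zj = z.unsqueeze(1).expand(-1, n, -1, -1)      # node j features, broadcast over i
        e = F.leaky_relu(self.a(torch.cat([zi, zj], dim=-1)).squeeze(-1))
        e = e.masked_fill(adj == 0, float("-inf"))     # attend only along dependency edges
        alpha = torch.nan_to_num(torch.softmax(e, dim=-1))  # isolated nodes get zero weights
        return F.elu(torch.bmm(alpha, z))


class BSGBSketch(nn.Module):
    def __init__(self, vocab_size, num_relations, emb=100, hidden=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb)
        # Stage 1: BiLSTM + SDA-GAT -> entity spans (BIO tags) and preliminary relations.
        self.bilstm = nn.LSTM(emb, hidden, batch_first=True, bidirectional=True)
        self.sda_gat = GATLayer(2 * hidden, 2 * hidden)
        self.span_head = nn.Linear(2 * hidden, 3)                   # B / I / O
        self.rel_head = nn.Linear(4 * hidden, num_relations)        # pairwise (wi, wj) scores
        # Stage 2: relation-weighted GCN refines entity-relation interactions.
        self.gcn = nn.Linear(2 * hidden, 2 * hidden)
        self.final_rel_head = nn.Linear(4 * hidden, num_relations)

    @staticmethod
    def _pairwise(h):
        """Concatenate features of every word pair: (B, N, D) -> (B, N, N, 2D)."""
        n = h.size(1)
        hi = h.unsqueeze(2).expand(-1, -1, n, -1)
        hj = h.unsqueeze(1).expand(-1, n, -1, -1)
        return torch.cat([hi, hj], dim=-1)

    def forward(self, token_ids, dep_adj):
        h, _ = self.bilstm(self.embed(token_ids))                   # sequential features
        h = h + self.sda_gat(h, dep_adj)                            # add local dependency features
        span_logits = self.span_head(h)                             # entity span detection
        prelim_rel = self.rel_head(self._pairwise(h))               # preliminary relation prediction
        # Assumed construction: average preliminary relation probabilities as edge weights.
        rel_adj = torch.sigmoid(prelim_rel).mean(dim=-1)            # (B, N, N)
        h2 = F.relu(self.gcn(torch.bmm(rel_adj, h)))                # relation-weighted GCN pass
        final_rel = self.final_rel_head(self._pairwise(h2))         # final triple scoring
        return span_logits, final_rel


# Toy usage: a batch of 2 sentences with 6 tokens and a random dependency adjacency.
model = BSGBSketch(vocab_size=1000, num_relations=24)
tokens = torch.randint(0, 1000, (2, 6))
adj = (torch.rand(2, 6, 6) > 0.5).float()
spans, relations = model(tokens, adj)                               # (2, 6, 3), (2, 6, 6, 24)
```

In this sketch, triples would be decoded by pairing detected entity spans with the arg-max relation label for the corresponding word pairs; the residual connection around the GAT layer is one simple way to combine the BiLSTM's sequential features with the dependency-graph features.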