text_complexity_analyzer_cm.coh_metrix_indices.connective_indices
index
/home/hans/Proyectos/Python/TextComplexityAnalyzerCM/text_complexity_analyzer_cm/coh_metrix_indices/connective_indices.py

 
Modules
       
multiprocessing
pyphen
spacy
string

 
Classes
       
builtins.object
ConnectiveIndices

 
class ConnectiveIndices(builtins.object)
    ConnectiveIndices(nlp, language: str = 'es', descriptive_indices: text_complexity_analyzer_cm.coh_metrix_indices.descriptive_indices.DescriptiveIndices = None) -> None
 
This class handles all operations needed to obtain the connective indices of a text according to Coh-Metrix. Incidence scores are reported per self._incidence words (conventionally 1,000 words in Coh-Metrix).
 
  Methods defined here:
__init__(self, nlp, language: str = 'es', descriptive_indices: text_complexity_analyzer_cm.coh_metrix_indices.descriptive_indices.DescriptiveIndices = None) -> None
The constructor initializes an object that calculates the connective indices for one of the available languages.
 
Parameters:
nlp: The spaCy model that corresponds to the chosen language.
language(str): The language of the texts that will be processed.
descriptive_indices(DescriptiveIndices): An optional, previously constructed DescriptiveIndices object for the same language.
 
Returns:
None.
get_additive_connectives_incidence(self, text: str, word_count: int = None, workers: int = -1) -> float
This method returns the incidence of additive connectives per self._incidence words.
 
Parameters:
text(str): The text to be analyzed.
word_count(int): The number of words in the text.
workers(int): Number of threads that will perform this operation. If -1, all CPU cores are used.
 
Returns:
float: The incidence of additive connectives per self._incidence words.
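The incidence computation that all of these methods share can be sketched in plain Python. This is an illustrative reconstruction, not the library's actual code: the real class matches connectives through the spaCy pipeline, the connective list below is hypothetical, and self._incidence is assumed to be 1,000 here.

```python
def connectives_incidence(text: str, connectives: set, incidence: int = 1000) -> float:
    """Count connective tokens and scale the count to an incidence
    per `incidence` words.

    Illustrative only: the real class matches (possibly multi-word)
    connectives via spaCy; this sketch uses naive whitespace tokenization.
    """
    words = text.lower().split()
    word_count = len(words)
    if word_count == 0:
        return 0.0
    hits = sum(1 for w in words if w.strip('.,;:') in connectives)
    return hits * incidence / word_count

# Hypothetical subset of Spanish additive connectives.
ADDITIVE = {'y', 'además', 'también', 'e'}
text = 'Juan estudia y trabaja; además, también entrena.'
print(connectives_incidence(text, ADDITIVE))  # ≈ 428.57 (3 hits in 7 words)
```

The same helper would serve the adversative, causal, logical, and temporal variants by swapping in the corresponding connective set.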
get_adversative_connectives_incidence(self, text: str, word_count: int = None, workers: int = -1) -> float
This method returns the incidence of adversative connectives per self._incidence words.
 
Parameters:
text(str): The text to be analyzed.
word_count(int): The number of words in the text.
workers(int): Number of threads that will perform this operation. If -1, all CPU cores are used.
 
Returns:
float: The incidence of adversative connectives per self._incidence words.
get_all_connectives_incidence(self, text: str, word_count: int = None, workers: int = -1) -> float
This method returns the incidence of all connectives per self._incidence words.
 
Parameters:
text(str): The text to be analyzed.
word_count(int): The number of words in the text.
workers(int): Number of threads that will perform this operation. If -1, all CPU cores are used.
 
Returns:
float: The incidence of all connectives per self._incidence words.
get_causal_connectives_incidence(self, text: str, word_count: int = None, workers: int = -1) -> float
This method returns the incidence of causal connectives per self._incidence words.
 
Parameters:
text(str): The text to be analyzed.
word_count(int): The number of words in the text.
workers(int): Number of threads that will perform this operation. If -1, all CPU cores are used.
 
Returns:
float: The incidence of causal connectives per self._incidence words.
get_logical_connectives_incidence(self, text: str, word_count: int = None, workers: int = -1) -> float
This method returns the incidence of logical connectives per self._incidence words.
 
Parameters:
text(str): The text to be analyzed.
word_count(int): The number of words in the text.
workers(int): Number of threads that will perform this operation. If -1, all CPU cores are used.
 
Returns:
float: The incidence of logical connectives per self._incidence words.
get_temporal_connectives_incidence(self, text: str, word_count: int = None, workers: int = -1) -> float
This method returns the incidence of temporal connectives per self._incidence words.
 
Parameters:
text(str): The text to be analyzed.
word_count(int): The number of words in the text.
workers(int): Number of threads that will perform this operation. If -1, all CPU cores are used.
 
Returns:
float: The incidence of temporal connectives per self._incidence words.
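Since the module imports multiprocessing, the workers parameter most likely controls a worker pool. The following is a minimal sketch, under assumed chunking (per sentence) and a hypothetical connective list, of how counting might be split across cores following the documented convention that -1 means all CPU cores:

```python
import multiprocessing

# Hypothetical subset of Spanish additive connectives.
ADDITIVE = {'y', 'además', 'también'}

def count_in_sentence(sentence: str) -> int:
    # Worker task: count connective tokens in a single sentence.
    return sum(1 for w in sentence.lower().split() if w.strip('.,;:') in ADDITIVE)

def parallel_connective_count(sentences: list, workers: int = -1) -> int:
    # Mirror the documented convention: -1 means use every CPU core.
    processes = multiprocessing.cpu_count() if workers == -1 else workers
    with multiprocessing.Pool(processes) as pool:
        return sum(pool.map(count_in_sentence, sentences))

if __name__ == '__main__':
    sents = ['Juan estudia y trabaja.', 'Además, también entrena.']
    print(parallel_connective_count(sents, workers=2))  # → 3
```

The total count would then be scaled by self._incidence / word_count to produce the float each method returns.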

Data descriptors defined here:
__dict__
dictionary for instance variables (if defined)
__weakref__
list of weak references to the object (if defined)

 
Data
        ACCEPTED_LANGUAGES = {'es': 'es_core_news_lg'}
Callable = typing.Callable
List = typing.List
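ACCEPTED_LANGUAGES maps each supported language code to the spaCy model the class expects. A sketch of how a caller might validate the language argument before constructing the class (the helper name and error message are illustrative, not part of the library):

```python
# Copied from the Data section above.
ACCEPTED_LANGUAGES = {'es': 'es_core_news_lg'}

def model_for_language(language: str) -> str:
    # Resolve the spaCy model name, rejecting unsupported languages early.
    if language not in ACCEPTED_LANGUAGES:
        raise ValueError(f'Language {language!r} is not supported.')
    return ACCEPTED_LANGUAGES[language]

print(model_for_language('es'))  # → es_core_news_lg
```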